WO2009071526A1 - Improved display of multiple overlaid objects - Google Patents

Improved display of multiple overlaid objects

Info

Publication number
WO2009071526A1
Authority
WO
WIPO (PCT)
Prior art keywords
pixel
pixels
traced
identified
neighbouring
Prior art date
Application number
PCT/EP2008/066584
Other languages
French (fr)
Inventor
Edward Oakeley
Original Assignee
Novartis Forschungsstiftung, Zweigniederlassung
Priority date
Filing date
Publication date
Application filed by Novartis Forschungsstiftung, Zweigniederlassung
Publication of WO2009071526A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/11Region-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30024Cell structures in vitro; Tissue sections in vitro

Definitions

  • the present invention relates to an improved algorithm for edge detection and determining the path of interconnected objects, hence providing an improved display of multiple overlaid objects.
  • the present invention relates to a method for allowing easy imaging of objects overlaid upon other objects.
  • the invention relates to a process for tracing and displaying neuron paths.
  • Neurons may have branched dendrite structures and be highly interconnected. It is desirable to be able to extract and/or view individual neurons or neuron paths from an image or an image stack of overlapping neurons.
  • the present invention provides a method for the identification and the tracing of individual objects from a pixelated image containing at least one object said method comprising propagating a trace from a pixel of origin situated within the edges of the object to be identified and traced, or from a last traced pixel, by determining if pixels neighbouring said pixel of origin, or last traced pixel, are also part of the object to be identified and traced, and propagating the trace to neighbouring pixels found to be part of the object to be identified and traced, wherein determining if a neighbouring pixel is part of the object to be identified and traced is performed by (a) constructing an area around said pixel of origin, or last traced pixel, said area comprising at least one additional pixel in every direction, (b) constructing an area around each of the untraced pixels neighbouring said pixel of origin, or last traced pixel, (neighbouring pixel) said area comprising the same number of additional pixels in every direction as in the area constructed in step (a).
  • the present invention provides a method for the identification and the tracing of individual objects from a pixelated image containing at least one object said method comprising propagating a trace from a pixel of origin situated within the edges of the object to be identified and traced, or from a last traced pixel, by determining if pixels neighbouring said pixel of origin, or last traced pixel, are also part of the object to be identified and traced, and propagating the trace to neighbouring pixels found to be part of the object to be identified and traced, wherein determining if a neighbouring pixel is part of the object to be identified and traced is performed by (a) constructing an area around each of the untraced pixels neighbouring said pixel of origin, or last traced pixel, (neighbouring pixel), said area comprising at least one additional pixel in every direction, (b) calculating a respective standard deviation value of the intensity of all the pixels of the area constructed in step (a) around each of the untraced neighbouring pixels, and (c) normalising each standard deviation value by dividing it by the intensity of the respective neighbouring pixel.
  • the object to be identified and traced in the methods of the invention described herein-above is contained in a stack of at least three pixelated images and wherein the areas of steps (a) to (c), for instance a square of 3x3 pixels, are replaced by volumes, for instance a cube of 3x3x3 pixels.
  • the object to be identified and traced is a neuron.
  • a preferred method of the invention which is a method for the identification and the tracing of individual, possibly bifurcating, objects from a pixelated image containing at least one possibly bifurcating object, and possibly containing multiple overlaid, possibly bifurcating objects, said method comprising the steps of (i) defining a first circle having the pixel of origin as centre and a radius of r, wherein r is bigger than the local radius of the possibly bifurcating object, and wherein said pixel of origin is situated within the edges of the object to be identified and traced, (ii) propagating a trace from the centre of the first circle to neighbouring pixels according to the method of any of the embodiments described herein-above.
  • the object to be identified and traced is contained in a stack of at least three pixelated images and wherein the circles and perimeters thereof of steps (i) to (vii) are replaced by spheres and surfaces thereof.
  • the radius r is set at step (i) to be equal to or greater than the mean local edge distance from the centre of the circle or sphere, respectively, when following the shortest one-dimensional path to the edge, or the mean distance from the respective integer median pixel to the patch edge.
  • the radius r can be set at step (i) to be twice the mean local edge distance from the centre of the circle or sphere, respectively.
  • the methods of the invention are performed on a device capable of performing the required calculations, for instance a computer.
  • the present invention hence also provides a computer program comprising code means which when executed on a computer perform all the steps of the respective methods of the invention described herein-above.
  • Another aspect of the present invention is a computer-readable medium comprising embedded or recorded thereon a computer program comprising code means which when executed on a computer perform all the steps of the respective methods of the invention described herein-above.
  • an aspect of the present invention is a system for identifying an object to be identified and traced in an image stack, comprising code means which when executed on a computer perform all the steps of the respective methods of the invention described herein-above; means for tracing the object; means for differentiating between branches of the object to be identified and other objects; and means for highlighting the object to be identified.
  • the objects may be neurons.
  • the display may be adapted to receive a pixelated image stack or to convert a series of images into a pixelated image stack.
  • the display may be adapted to determine or receive a start point on the object or object path to be traced.
  • the means for tracing may be adapted to section the image into at least one trace area.
  • the trace area may be a sphere having radius r.
  • the means for tracing may include means for identifying pixels representing objects.
  • the means for tracing may be adapted to perform edge detection on at least one of the objects.
  • the means for tracing may be adapted to propagate a trace from the start point on the object to be identified by determining if neighbouring pixels represent an object and continuing to propagate the trace from those pixels that are determined to represent an object until there are no untraced neighbouring pixels to trace along and/or the boundary of a trace area has been reached.
  • the means for tracing may be adapted to calculate if an intensity of a pixel and/or pixel noise is above or below a threshold value.
  • the pixel noise may be calculated by determining the standard deviation of intensities of predetermined pixels around the pixel whose pixel noise is being calculated and dividing this by an intensity of the pixel whose noise is being calculated.
  • the predetermined pixels may be a 3x3x3 cube of pixels having the pixel whose noise is being calculated in the centre.
  • the means for tracing may be adapted to assign pixels having an intensity and/or noise above the threshold value as being a pixel on the edge of a cell.
  • the means for tracing may be adapted to transfer coordinates of pixels of the input image stack that have been determined to represent the object to be identified to an output image stack.
  • the means for differentiating between branches of the object to be identified and other objects may be adapted to determine patches of pixels, a patch of pixels being defined as a collection of pixels lying on the boundary of the trace area, the pixels representing a part of an object, and in which every pixel in the patch contacts at least one other pixel in the patch. In the case of three dimensional images, the trace area is a local sphere.
  • the means for differentiating between branches of the object to be identified and other objects may be adapted to determine if each patch of pixels represents the traced object or a different object and assigning a new start point in each patch of pixels belonging to the object to be traced.
  • the means for tracing may be adapted to trace the object in one trace area at a time and to continue the trace by generating a new trace area associated with each patch of pixels that has been determined to belong to the object to be traced.
  • the means for differentiating between branches of the object to be identified and other objects may be adapted to determine if there is only one untraced patch on a trace area and if so, continue tracing in the direction of the untraced patch centre.
  • the means for differentiating between branches of the object to be identified and other objects may be adapted to determine if an untraced patch centre lies substantially opposite the centre of the trace area from an already traced patch and if so, continue the tracing only in the direction of this untraced patch centre.
  • a third circle or sphere will be constructed with a radius R' that is bigger than R.
  • the grandchildren patch centres will then be determined for each of the child patch centres.
  • the parent patch centre will only be considered valid for trace propagation if the child and grandchild patch centres lie substantially on a straight line.
  • the means for differentiating between branches of the object to be identified and other objects may be adapted to determine both patches and child (or alternative) patches for a trace area being a sphere of radius r (where in the case of neurons, "r" can be defined as the radius of the neuron), a child patch being a patch lying on a sphere of radius 2r around the centre of the trace area.
  • the means for differentiating is further adapted to determine if two non-origin patches on a trace area boundary and two child patches lie substantially on a straight line, and if so not continuing tracing in the direction of these patches.
  • the means for identifying the origin of a patch is achieved by generating a GUID that is shared by all pixels traced at the same moment.
  • the means for differentiating may be adapted to interpolate a first line between two non-origin patches on a trace area boundary and interpolating a second line between two child patches and continuing tracing in the direction of these patches only if the perpendicular distance between the first and second lines is greater than a branch value.
  • the branch value may be 0.2r, where r is the radius of the trace area.
  • the means for highlighting the object to be identified may be a display showing an object extracted from the initial image.
  • the means for highlighting may be a display adapted to colour the object to be identified in a colour differing from that of the rest of the objects.
  • the Z-dimension may be represented through colour intensity where layers that are further away from the observer may be darker and those closer to the observer may be lighter.
  • a further aspect of the present invention is a method for identifying an object in an image stack using one of the methods of the invention described herein-above, comprising sectioning the image into at least one trace area having a boundary; tracing the object; differentiating between branches of the object to be identified and other objects at the boundary of each trace area.
  • the objects may be neurons.
  • the method may include receiving a pixelated image stack or converting a series of images into a pixelated image stack.
  • the method may further comprise determining or receiving a start point on the object or object path to be traced.
  • the trace area(s) may be spheres having radius r.
  • Tracing the object may include identifying pixels representing objects. Tracing the object may include performing edge detection on at least one of the objects. Tracing the object may include propagating a trace from the start point on the object to be identified by determining if neighbouring pixels represent an object and continuing to propagate the trace from those pixels that are determined to represent an object until there are no untraced neighbouring pixels to trace along and/or the boundary of a trace area has been reached.
  • Tracing the object may further comprise calculating if an intensity of a pixel and/or pixel noise is above or below a threshold value.
  • the pixel noise may be calculated by determining the standard deviation of intensities of predetermined pixels around the pixel whose pixel noise is being calculated and dividing this by an intensity of the pixel whose noise is being calculated.
  • the predetermined pixels may be a 3x3x3 cube of pixels having the pixel whose noise is being calculated in the centre.
  • Tracing the object may include assigning pixels having an intensity and/or noise above the threshold value as being a pixel on the edge of a cell.
  • the method may further comprise transferring the coordinates of pixels of the input image that have been determined to represent the object to be identified to an output image.
  • Each pixel found to represent the object to be identified within any trace area may be assigned a GUID code corresponding to that trace area.
  • Differentiating between branches of the object to be identified and other objects may comprise determining patches of pixels, a patch of pixels being a collection of pixels lying on the boundary of the trace area, the pixels representing a part of an object, and in which every pixel in the patch contacts at least one other pixel in the patch.
  • the method may further comprise determining if each patch of pixels represents the traced object or a different object and assigning a new start point in each patch of pixels belonging to the object to be traced.
  • the identification of patches may include determining the centre of each patch, the coordinates of the centre being defined by calculating the median value of the X, Y and Z axes for the patch.
  • the median may ensure that the patch centre lies on the surface of the trace area.
  • the new start point associated with a patch may be its patch centre.
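The patch-centre calculation above can be sketched with a per-axis integer median. The function name and the list-of-coordinate-tuples representation are illustrative assumptions; median_low is used so that each coordinate is an actual data value, matching the "integer median pixel" described elsewhere in the text.

```python
from statistics import median_low

def patch_centre(patch):
    """Centre of a patch of boundary pixels: the integer median of the
    X, Y and Z coordinates taken independently. median_low returns an
    actual coordinate value rather than an interpolated average."""
    xs, ys, zs = zip(*patch)
    return (median_low(xs), median_low(ys), median_low(zs))
```

Using a median rather than a mean keeps the centre from being pulled off the trace-area surface by an asymmetric patch shape.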
  • differentiating between branches of the object to be identified and other objects may include determining if a non-origin patch centre lies substantially opposite the start point from an already traced patch and if so, continuing tracing only in the direction of this untraced patch centre.
  • differentiating between branches of the object to be identified and other objects may include determining patches for a trace area, being a sphere of radius r, and child patches, a child patch being a patch lying on a sphere of radius 2r around the centre of the trace area.
  • the differentiating method may include determining if two non-origin patches on a trace area boundary and two child patches lie substantially on a straight line, and if so, not continuing tracing in the direction of these patches.
  • the differentiating method may include interpolating a first line between two untraced patches on a trace area boundary and interpolating a second line between two child patches and continuing tracing in the direction of these patches only if the perpendicular distance between the first and second lines is greater than a branch value.
  • the branch value may be 0.2r, where r is the radius of the trace area.
  • the method may include highlighting the object to be identified by displaying the object extracted from the initial image.
  • the method may include colouring the object to be identified in a colour differing from that of the rest of the objects.
  • the method may include converting pixel coordinates from a global frame of reference to a local frame of reference for each trace area.
  • the local frame of reference may be a cube centred on the centre of the trace area (the start point for that trace area).
  • the local coordinate frame may be a cube having coordinates running from (0, 0, 0) to (2r, 2r, 2r).
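The conversion between the global and local frames of reference can be sketched as a pure offset, which is what makes it cheaper than trigonometry. The helper names are illustrative; the only assumption is that the start pixel maps to the centre (r, r, r) of the local cube.

```python
def to_local(global_xyz, start_xyz, r):
    """Map a global image coordinate into the local bounding cube
    running from (0,0,0) to (2r,2r,2r), whose centre (r,r,r) is the
    start pixel of the trace area."""
    return tuple(g - s + r for g, s in zip(global_xyz, start_xyz))

def to_global(local_xyz, start_xyz, r):
    """Inverse mapping, used when transferring traced pixels back to
    the output image stack."""
    return tuple(l - r + s for l, s in zip(local_xyz, start_xyz))
```

Both directions are simple addition/subtraction per axis, so transposing a whole trace area costs one offset per pixel.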
  • a further aspect of the present invention is a method for tracing an object in an image according to the methods of the invention described herein-above, comprising sectioning the image into at least one trace area, performing edge detection on the object, propagating a trace from a start point on the object by determining if neighbouring pixels represent the object and continuing to propagate the trace from those pixels that are determined to represent the object until there are no untraced neighbouring pixels to trace along and/or the boundary of a trace area has been reached and/or the edge of the object has been reached.
  • Another aspect of the present invention is a method of performing edge detection using the methods of the invention described herein-above, comprising calculating if an intensity of a pixel and/or pixel noise is above or below a threshold value, wherein the pixel noise is calculated by determining the standard deviation of the intensity of predetermined pixels around the pixel whose pixel noise is being calculated and dividing this by the intensity of the pixel whose noise is being calculated; and assigning pixels having an intensity and/or noise above the threshold value as being a pixel on the edge of an object.
  • the predetermined pixels may be a 3x3x3 cube of pixels having the pixel whose noise is being calculated in the centre.
  • Another aspect of the present invention is a method for differentiating between branches of an object in an image from other objects using the methods of the invention described herein-above, comprising sectioning the image into at least one trace area and tracing the path of an object in at least one of the trace area(s); determining patches of pixels, a patch of pixels being a collection of pixels that represent a part of an object that lies on the boundary of the trace area in which every pixel in the patch contacts at least one other pixel in the patch; determining if there is only one untraced patch on a trace area boundary and if so, designating the untraced patch centre as belonging to the object; and/or determining if an untraced patch centre lies substantially opposite the start point from an already traced patch and if so, designating the untraced patch centre as belonging to the object; and/or determining child patches for a trace area, a child patch being a patch lying on a sphere of radius 2r around the centre of the trace area, and determining if two untraced patches on a trace area boundary and two corresponding child patches lie substantially on a straight line and, if so, designating those patches as belonging to a different object.
  • the method for differentiating between branches of an object in an image from other objects may additionally or alternatively include interpolating a first line between two non-origin patches on a trace area boundary and interpolating a second line between two child patches and designating the patches as belonging to the object only if the perpendicular distance between the first and second lines is greater than a branch value.
  • the branch value may be 0.2r, where r is the radius of the trace area.
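The perpendicular-distance test above can be sketched as the distance between the two interpolated lines in 3D. The patent specifies only the 0.2r branch value; the helper names and the skew-line distance formula used here are standard geometry supplied for illustration.

```python
import math

def sub(u, v):
    return tuple(a - b for a, b in zip(u, v))

def cross(u, v):
    return (u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0])

def norm(u):
    return math.sqrt(sum(a * a for a in u))

def line_separation(a1, a2, b1, b2):
    """Perpendicular distance between the line through a1, a2 (the two
    non-origin patch centres) and the line through b1, b2 (the two
    child patch centres)."""
    d1, d2 = sub(a2, a1), sub(b2, b1)
    n = cross(d1, d2)
    if norm(n) < 1e-12:                 # parallel lines: point-to-line distance
        return norm(cross(sub(b1, a1), d1)) / norm(d1)
    return abs(sum(w * c for w, c in zip(sub(b1, a1), n))) / norm(n)

def is_valid_bifurcation(a1, a2, b1, b2, r):
    """Separations above the 0.2r branch value indicate a genuine
    bifurcation; smaller separations suggest a straight crossing object."""
    return line_separation(a1, a2, b1, b2) > 0.2 * r
```

When the four centres are nearly collinear the separation collapses towards zero, which is exactly the crossing-neuron case the filter rejects.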
  • the term "significant" means either statistically significant as determined by e.g. a Welch T-test where a p value will be less than 0.05, or a difference which is above an absolute threshold, for example 1.5-, 2-, 3-, 4- or 5-fold.
  • Figure 1 is a schematic of the display apparatus of an embodiment of the present invention;
  • Figure 2 is an image of a sample of neurons;
  • Figure 3 is an initial output image;
  • Figure 4 shows selection of an origin on a neuron in the image of Figure 2;
  • Figure 5 shows generation of a spherical trace area around the origin of Figure 4;
  • Figure 6 shows a transformation into an alternative frame of reference for calculations relating to the sphere of Figure 5;
  • Figure 7 shows the frame of reference of Figure 6, containing the results of a neighbour fill;
  • Figure 8 shows the results of the neighbour fill shown in Figure 7, transposed into the output frame of Figure 3;
  • Figure 9 shows two patch centres resulting from the neighbour fill shown in Figure 7;
  • Figure 10 shows the determination of three patch centres in an alternate neighbour fill;
  • Figure 11 is a schematic of the process of differentiating between a neuron being traced and a crossing neuron;
  • Figure 12 shows two valid patch centres on a neuron being traced and rejection of a crossing neuron in an alternate neighbour fill;
  • Figure 13 is a schematic of the determination of whether or not two patch centres relate to a crossing neuron or branches of the neuron being traced;
  • Figure 14 is a display showing an output image highlighting the selected neuron.
  • Figure 1 shows an image processing and display apparatus 5 for tracing and highlighting a selected neuron or neuron path from an image 10 of neurons.
  • the apparatus 5 includes memory 15 for storing input and output image data and any variables used in the processing of the image 10, a processor 20 for processing the image 10 and a display 25, such as an LCD or cathode ray tube monitor, or other output device such as a hard disk, printer, communications link, further memory means and the like, for outputting the processed image 30.
  • the image processing and display apparatus 5 contains means for receiving an image such as a communications link, disk drive or memory stick/card/chip/device reader.
  • the apparatus may form a stand-alone station for post processing an image obtained from a separate image collection apparatus.
  • the image processing and display apparatus may be integral with the image collection device. This allows real time imaging and tracing of neurons, which affords greater opportunity to adjust image collection parameters, viewing conditions, imaging angles, equipment setup etc. in order to best extract the required information during the imaging session.
  • Figure 2 shows an input image 10 of a sample of neuron cells 35, 40.
  • the image may be an image produced by any neuron image collection means known in the art such as a fluorescence microscope.
  • the image is either digital or digitised such that it is pixelated.
  • the image shows a 2D x,y plane slice of a 3D image stack formed by many such images stacked in the z-direction. For clarity, only an x, y plane image is shown.
  • the input image 10 may be considered as an array of pixels, with each pixel being assigned a value in accordance to its brightness or luminance.
  • the image stack may be represented mathematically as a 3D matrix of data, the data representing for example, the coordinates of pixels, the brightness or luminescence of pixels or colour of pixels.
  • the image 10 is processed to determine which pixels represent neurons 35, 40 and which do not. In one embodiment of the present invention, this involves first calculating a pixel noise value for each pixel in the image having an intensity/brightness above a certain limit.
  • Determining the noise value for a pixel involves defining a 3x3x3 pixel cube whose central pixel is the pixel whose noise value is to be determined.
  • the pixel noise value is calculated by determining the standard deviation of the intensity of the 27 pixels in the contact cube and dividing it by the absolute intensity of the pixel under investigation. Hence, these steps provide a measure of the local rate of change of intensity. Within a neuron the intensity is fairly uniform, and outside a neuron the intensity is also fairly uniform; at the edge, however, the standard deviation briefly becomes significant, and the place where this occurs marks the edge. Thus, the pixel noise is used to determine which pixels lie on the edges 45 of a neuron 35, 40.
  • the system assigns pixels having a high pixel noise (above an assigned pixel noise value) as edge pixels and those 50 having a low pixel noise as non-edge. Provided the initial start position lies within a neuron, the observed high-variation boundary will mark the edge of the neuron. A second filter may be applied to check that the local absolute pixel intensity does not drop below a given threshold, to reduce the danger of escaping from a "dark" neuron into the surrounding area.
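The noise-based edge test described above can be sketched as follows. This is a minimal illustration, assuming intensities in a nested list indexed as stack[z][y][x]; the function names and the default threshold are illustrative, not taken from the patent.

```python
from statistics import pstdev

def pixel_noise(stack, x, y, z):
    """Noise for one pixel: the standard deviation of the 27 intensities
    in its 3x3x3 contact cube, divided by the pixel's own intensity.
    The stack is indexed stack[z][y][x]."""
    cube = [stack[z + dz][y + dy][x + dx]
            for dz in (-1, 0, 1) for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
    return pstdev(cube) / stack[z][y][x]

def is_edge(stack, x, y, z, noise_threshold=0.5):
    """High local noise marks a rapid change of intensity, i.e. an edge.
    The threshold value here is an illustrative placeholder."""
    return pixel_noise(stack, x, y, z) > noise_threshold
```

Inside a uniform region the cube's standard deviation is zero, so the noise is zero; only near an intensity step does the ratio rise above the threshold.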
  • the value of pixel noise above which a pixel is determined as being on the edge 45 of a neuron is selectable by a user and depends on a variety of factors such as the overall intensity of the image 10, the type of cells being imaged, the image collection conditions and apparatus used.
  • a further intensity threshold is set that will define an edge 45, even if the pixel noise threshold is not reached.
  • An output image or image stack 30 is then created that has the same dimensions as the input image or image stack 10.
  • the output image or image stack 30 is initially blank, as shown in Figure 3.
  • a spherical trace area 65 of predefined radius r having the start pixel 60 at its origin is determined, as illustrated in Figure 5.
  • the radius r may be set by the user, or determined automatically, and may be twice the local radius of the neuronal process being traced.
  • an alternative embodiment of the methods of the invention makes use of local coordinates. Once local coordinates have been calculated, this information can be mapped to the region being traced using addition/subtraction rather than trigonometry. This embodiment will indicate whether a pixel lies less than "r" pixels from the trace centre or not. This embodiment is hence much faster than calculating the coordinates of a spherical shell for every trace area but gives a similar answer.
  • a bounding frame 70 having opposing corners defined as (0, 0, 0) and (2r, 2r, 2r), as shown in Figure 6, is calculated to enclose the spherical trace area 65.
  • the global coordinates (with respect to the input image 10) of the pixels within the boundary frame 70 are converted to local coordinates with respect to the boundary frame 70, i.e. the frame of reference has been transposed.
  • the angles of the trace can be pre-calculated. These angles can be used to determine where the lines go, thereby reducing the number of trigonometric calculations used in the tracing loop. To further speed up the calculation process, the pixels lying on the surface of the spherical trace area 65 and the angles they make to the centre 60 of the spherical trace area 65 can be determined at this point. As shown in Figure 7, a neighbour fill is then carried out starting from the start pixel 60. This involves checking the status variable for each of the pixels neighbouring the start pixel. Those neighbouring pixels having a status value of "true" are stored in a boundary frame data class. The process is repeated for the neighbours of each of the first set of neighbouring pixels 50 whose status value is "true" until all the "true" pixels 50 have been discovered or the boundary 75 of the trace area 65 has been reached.
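The neighbour fill can be sketched as a breadth-first fill over the 26 neighbours of each traced pixel, bounded by the spherical trace area. Here is_true stands in for the per-pixel status variable; it and the other names are placeholders, not the patent's data structures.

```python
from collections import deque

def neighbour_fill(is_true, start, r):
    """Breadth-first fill from the start pixel: visit each traced
    pixel's 26 neighbours, keeping only pixels whose status is "true"
    and which lie within the spherical trace area of radius r."""
    offsets = [(dx, dy, dz)
               for dx in (-1, 0, 1) for dy in (-1, 0, 1) for dz in (-1, 0, 1)
               if (dx, dy, dz) != (0, 0, 0)]
    filled, queue = {start}, deque([start])
    while queue:
        x, y, z = queue.popleft()
        for dx, dy, dz in offsets:
            p = (x + dx, y + dy, z + dz)
            if p in filled or not is_true(p):
                continue
            # pixels beyond the trace-area boundary are left untraced;
            # they will form the boundary patches for the next trace area
            if sum((a - b) ** 2 for a, b in zip(p, start)) > r * r:
                continue
            filled.add(p)
            queue.append(p)
    return filled
```

The distance check replaces any trigonometry: a squared-distance comparison against r² decides whether a pixel lies inside the sphere.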
  • the pixels 50 from the trace area 65 having a status value of "true" are transformed from local coordinates in the bounding frame 70 to coordinates of the output image 30 and transferred to the output image 30, as shown in Figure 8.
  • a GUID value is generated for the trace area 65 and assigned to each of the pixels 50 within the trace area having a status value of "true".
  • the GUID (Globally Unique Identifier) value may be one of 2^128 possible combinations (a 128-bit number).
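For the GUID itself, a standard 128-bit UUID fits the description. Using uuid4 is one convenient choice, not something the patent mandates, and the function name is illustrative.

```python
import uuid

def tag_trace_area(pixels):
    """Generate one 128-bit GUID for a trace area and tag every traced
    pixel with it, so that patches can later be matched to the trace
    area in which they were filled."""
    guid = uuid.uuid4()
    return guid, {p: guid for p in pixels}
```

Every pixel filled in one neighbour fill shares the same tag, which is what lets the later steps tell "traced in this area" apart from "traced in an earlier area".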
  • the next area of the image 10 that the tracing process is to be applied is determined.
  • the centre 80c of each patch 80 is then determined by finding the median value of the X, Y and Z coordinates of the pixels in the patch 80.
  • each of the patch centres 80, 80' is considered to be a new start pixel 60.
  • new trace areas 65 having the new start pixel 60 at the centre of the trace area 65 are determined.
  • Figure 10 shows the determination of a trace vector 95, which indicates a direction of movement of the tracing process.
  • the trace vector 95 (in the case of second and subsequent trace areas) is obtained by determining which of the patches 85 on a trace area has a GUID the same as that of the new start point 60 (i.e. was traced in the same trace area 65 as the start point 60).
  • the centre 85c of the patch 85 having the same GUID as the new start point 60 is taken as the origin 85c of the trace area 65.
  • the vector from the origin 85c to the start point 60 of the trace area 65 is the trace vector 95.
  • It is then determined whether each patch centre 80, 90, 90' belongs to the neuron being traced 35 or to a different neuron 40.
  • This process is illustrated in Figures 11 to 13. If there is a patch centre 80 substantially opposite the start pixel 60 from the origin 85c, i.e. the angle from the origin 85c to the centre point of the trace area (start pixel 60) matches, or is within, for example, ±0.1 radians of, the angle from the centre point (start pixel 60) to another patch centre 80c, as shown in Figure 12, then only the patch centre 80c opposite the origin 85c is used as the new start pixel 60; any other patch centres 90, 90' for that trace area 65 are ignored and tracing continues accordingly.
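The opposite-patch test can be sketched as an angle comparison between the origin-to-start direction and the start-to-patch direction, with the 0.1-radian tolerance given in the text. The function name and vector handling are illustrative.

```python
import math

def is_opposite(origin, start, patch, tol=0.1):
    """True if the patch centre lies substantially opposite the trace
    origin across the start pixel: the origin->start and start->patch
    directions agree to within tol radians."""
    v1 = [s - o for s, o in zip(start, origin)]
    v2 = [p - s for p, s in zip(patch, start)]
    dot = sum(a * b for a, b in zip(v1, v2))
    n1 = math.sqrt(sum(a * a for a in v1))
    n2 = math.sqrt(sum(a * a for a in v2))
    # clamp guards against floating-point values slightly outside [-1, 1]
    angle = math.acos(max(-1.0, min(1.0, dot / (n1 * n2))))
    return angle <= tol
```

When exactly one untraced patch passes this test, the trace simply continues straight through the trace area towards that patch centre.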
  • this patch centre 80 becomes the new start pixel 60 and tracing continues based upon this.
  • a child patch centre 105c, 105c' is a patch centre that lies on a sphere 110 having radius of 2r around the start pixel 60 of the original trace area 65.
  • Whether or not this exclusion applies can be determined by interpolating a first line 120 through the two non-origin patch centres 100c, lOOc'and interpolating a second line 125 through a child patch centre 105c, 105c' for each of the parent patch centres 100, 100' respectively. If the perpendicular separation 130 of these two lines 120, 125 is greater than 20% of r, then the patches 100, 100'represent a valid bifurcation of the neuron 35 and both patch centres 100c and 100c' become new start pixels 60. If the separation of the two lines 120, 125 is less than 20% of r, then both patches 100 and 100' are designated as belonging to another neuron 40 and ignored. In other words, there are two spheres both with the same centre.
  • One (small) has a radius of r, the other (big) a radius of 2r.
  • the object, e.g. neuron, is traced from the middle of the small sphere down the paths that either do not have GUIDs set or have GUIDs different from that of the centre.
  • When the trace reaches the edge of the small sphere, new patch centres are calculated. For each of these, the trace continues until the edge of the big sphere is reached. Since the spheres are quite small, there is little space for the object to split between the edges of the small and the big sphere, but if it did, then each of the big-sphere patches descended from a single small-sphere patch would be used to calculate the angles. If it passes the filters, then this split will be investigated in the next cycle of tracing.
  • whether or not the straight-line exclusion applies can be determined by calculating an angle (a) between a first patch centre 100c, the origin 60 and a child patch centre 105c corresponding to the first patch centre 100c; an angle (a') between a second patch centre 100c', the origin 60 and a child patch centre 105c' corresponding to the second patch centre 100c'; an angle (b) between the first 100c and second 100c' patch centres and the origin 60; and an angle (c) between the child patch centres 105c, 105c'.
  • If a is within 0.1 radians of a' and c is within 0.1 radians of the sum of a + a' + b, the first and second patches 100, 100' are determined to correspond to a different neuron 40 crossing the neuron being traced 35. In this case, the first and second patches 100, 100' are ignored. However, if a differs from a' by more than 0.1 radians, or c differs from the sum of a + a' + b by more than 0.1 radians, then both patch centres 100 and 100' are assigned as new start points 60 and the tracing process continues down these branches as described above.
  • the tracing process, including determining new trace areas 65 for each valid patch centre 85c and performing neighbour fills of each trace area 65, is repeated until the trace areas 65 of all the valid patch centres 80c have been traced and the appropriate pixels having a status value of "true" have been transferred to the output image 30.
  • the output image 30 shows only the selected neuron 35. Any other neurons 40, including those that cross the selected neuron 35, are filtered out from the original image 10.
  • the colour of pixels associated with the output image 30 may be changed and the output image 30 superimposed upon the original image 10 in order to highlight the traced neuron 35 over the other neurons 40 in the original image 10.
  • the above method could also be used to extract objects, e.g. trees or people, from movie sequences by replacing the z-stack information of the 3D neuron images with a series of images in the time domain.
  • the edge detection and cross point detection techniques described above may be used to identify a person or object, e.g. tree, from a complex background. The defined person or object may then be highlighted or removed from the movie sequence or subjected to other processing.
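The opposite-patch-centre filter described above (Figure 12) can be sketched as follows. This is an illustrative Python sketch, not the patented implementation; the function name and the representation of points as coordinate tuples are assumptions.

```python
import math

def find_opposite_patch_centre(origin, start, candidates, tol=0.1):
    """Return the candidate patch centre lying substantially opposite
    `origin` across the trace-area centre `start`, or None.

    A candidate is "opposite" when the direction origin -> start and the
    direction start -> candidate agree to within `tol` radians (0.1 in
    the text), i.e. the trace would continue in a straight line.
    Points are (x, y, z) tuples.
    """
    def diff(a, b):
        return tuple(ai - bi for ai, bi in zip(a, b))

    def angle_between(u, v):
        dot = sum(ui * vi for ui, vi in zip(u, v))
        norm = math.sqrt(sum(c * c for c in u)) * math.sqrt(sum(c * c for c in v))
        return math.acos(max(-1.0, min(1.0, dot / norm)))

    incoming = diff(start, origin)  # direction in which the trace arrived
    for c in candidates:
        if angle_between(incoming, diff(c, start)) <= tol:
            return c  # continue tracing only through this centre
    return None
```

When such a centre is found, it becomes the new start pixel and any other patch centres on that trace area are ignored, as described above.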

Abstract

The present invention relates to an improved algorithm for edge detection and determining the path of interconnected objects, e.g. neurons, hence providing an improved display of multiple overlaid objects. In particular, the present invention relates to a method for allowing easy imaging of objects overlaid upon other objects.

Description

Improved Display of multiple overlaid objects
Object of the Invention
The present invention relates to an improved algorithm for edge detection and determining the path of interconnected objects, hence providing an improved display of multiple overlaid objects. In particular, the present invention relates to a method for allowing easy imaging of objects overlaid upon other objects. In one embodiment, the invention relates to a process for tracing and displaying neuron paths.
Background
The identification of individual objects from an image containing multiple overlaid objects presents a significant challenge to human operators and particularly to the automation of the process. This is especially challenging when the objects are irregularly shaped or contain branches, as it is often difficult to differentiate between branches and different overlaid objects. Lack of resolution or contrast in images exacerbates this problem. An example of such a data set would be an image stack through a group of neurons, such as those found in a brain section.
Neurons may have branched dendrite structures and be highly interconnected. It is desirable to be able to extract and/or view individual neurons or neuron paths from an image or an image stack of overlapping neurons.
Description of Invention
In order to fulfil this need, the present invention provides a method for the identification and the tracing of individual objects from a pixelated image containing at least one object, said method comprising propagating a trace from a pixel of origin situated within the edges of the object to be identified and traced, or from a last traced pixel, by determining if pixels neighbouring said pixel of origin, or last traced pixel, are also part of the object to be identified and traced, and propagating the trace to neighbouring pixels found to be part of the object to be identified and traced, wherein determining if a neighbouring pixel is part of the object to be identified and traced is performed by (a) constructing an area around said pixel of origin, or last traced pixel, said area comprising at least one additional pixel in every direction, (b) constructing an area around each of the untraced pixels neighbouring said pixel of origin, or last traced pixel, (neighbouring pixel), said area comprising the same number of additional pixels in every direction as in the area constructed in step (a), (c) calculating a respective standard deviation of the intensity of all the pixels of the area for each of the areas constructed in steps (a) and (b), optionally normalising said standard deviation, preferably by dividing said standard deviation by the intensity of the pixel of origin, or the pixel being traced respectively, wherein a significant increase, or a value above a pre-defined threshold, for instance 0.01, 0.02, 0.03, 0.04, 0.05, 0.06, 0.08 or 0.1, of the standard deviation calculated for the pixels of the area constructed in step (b) around a particular neighbouring pixel as compared to the standard deviation calculated for the pixels of the area constructed in step
(a) around said pixel of origin, or last traced pixel, indicates that said particular neighbouring pixel is part of the edge of the object to be identified and traced and is therefore not part of the object to be identified and traced, whereas a similar value, or a value below a pre-defined threshold, for the standard deviation calculated for the pixels of the area constructed in step
(b) around a particular neighbouring pixel as compared to the standard deviation calculated for the pixels of the area constructed in step (a) around said pixel of origin, or last traced pixel, indicates that said particular neighbouring pixel is part of the object to be identified and traced, (d) propagating the trace to the untraced neighbouring pixels which have been determined in step (c) to be part of the object to be identified and traced, and (e) repeating steps (a) to (d) for each of the pixels traced in step (d) until there are no untraced neighbouring pixels identified as part of the object to be identified and traced and/or until the boundary of another already traced area has been reached.
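A minimal 2D sketch of steps (a) to (e) above, assuming a NumPy floating-point image and 3x3 areas (the 3D case of the later embodiments replaces these with 3x3x3 cubes); the function and parameter names are illustrative assumptions, not part of the original disclosure:

```python
import numpy as np
from collections import deque

def trace_object(image, seed, threshold=0.05):
    """Flood-fill the object containing `seed` using the edge test of
    steps (a)-(e): a neighbouring pixel joins the trace while the
    normalised standard deviation of its 3x3 area (std divided by the
    pixel's own intensity) does not rise more than `threshold` above
    that of the pixel it was reached from. `threshold` = 0.05 is one of
    the example values given in the text.
    """
    h, w = image.shape

    def norm_sd(y, x):
        # standard deviation of the 3x3 area, normalised by intensity
        area = image[max(y - 1, 0):y + 2, max(x - 1, 0):x + 2]
        return float(area.std()) / float(image[y, x])

    traced = {seed}
    queue = deque([seed])
    while queue:
        y, x = queue.popleft()
        base = norm_sd(y, x)
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                ny, nx = y + dy, x + dx
                if (ny, nx) in traced or not (0 <= ny < h and 0 <= nx < w):
                    continue
                # step (c): edge pixels show a jump in normalised noise
                # and are therefore not propagated to
                if norm_sd(ny, nx) - base <= threshold:
                    traced.add((ny, nx))
                    queue.append((ny, nx))
    return traced
```

On a uniform bright square over a dark background, the trace fills the interior and stops one pixel short of the boundary, since the 3x3 areas of boundary pixels mix object and background intensities and so show a large normalised standard deviation.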
In an alternative embodiment, the present invention provides a method for the identification and the tracing of individual objects from a pixelated image containing at least one object, said method comprising propagating a trace from a pixel of origin situated within the edges of the object to be identified and traced, or from a last traced pixel, by determining if pixels neighbouring said pixel of origin, or last traced pixel, are also part of the object to be identified and traced, and propagating the trace to neighbouring pixels found to be part of the object to be identified and traced, wherein determining if a neighbouring pixel is part of the object to be identified and traced is performed by (a) constructing an area around each of the untraced pixels neighbouring said pixel of origin, or last traced pixel, (neighbouring pixel), said area comprising at least one additional pixel in every direction, (b) calculating a respective standard deviation value of the intensity of all the pixels of the area constructed in step (a) around each of the untraced neighbouring pixels, (c) normalising each of the standard deviation values calculated in step (b), by dividing said standard deviation value by the intensity of the untraced neighbouring pixel around which the area has been constructed in step (a), hence obtaining a normalised standard deviation value for each of the untraced neighbouring pixels, wherein a normalised standard deviation value above a pre-defined threshold, for example 0.01, 0.02, 0.03, 0.04, 0.05, 0.06, 0.08 or 0.1, indicates that the untraced neighbouring pixel around which the area has been constructed in step (a) is part of the edge of the object to be identified and traced and is therefore not part of the object to be identified and traced, whereas a normalised standard deviation value below or equal to a pre-defined threshold indicates that said particular untraced neighbouring pixel around which the area has been constructed in step (a) is 
part of the object to be identified and traced, (d) propagating the trace to the untraced neighbouring pixels which have been determined in step (c) to be part of the object to be identified and traced, and (e) repeating steps (a) to (d) for each of the pixels traced in step (d) until there are no untraced neighbouring pixels identified as part of the object to be identified and traced and/or until the boundary of another already traced area has been reached.
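The alternative embodiment differs only in the decision rule: the neighbouring pixel's own normalised standard deviation is compared against an absolute threshold rather than against the value at the pixel of origin. A 2D sketch (function name assumed):

```python
import numpy as np

def neighbour_in_object(image, pixel, threshold=0.05):
    """Test of steps (a)-(c) of the alternative embodiment: the standard
    deviation of the pixel's 3x3 area, divided by the pixel's own
    intensity, is compared against an absolute pre-defined threshold.
    A value above the threshold marks an edge pixel; 0.05 is one of the
    example thresholds in the text."""
    y, x = pixel
    area = image[max(y - 1, 0):y + 2, max(x - 1, 0):x + 2]
    return float(area.std()) / float(image[y, x]) <= threshold
```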
In preferred embodiments of the invention, the object to be identified and traced in the methods of the invention described herein-above is contained in a stack of at least three pixelated images and wherein the areas of steps (a) to (c), for instance a square of 3x3 pixels, are replaced by volumes, for instance a cube of 3x3x3 pixels.
In a most preferred embodiment of the invention, the object to be identified and traced is a neuron.
The methods described herein-above allow a preferred method of the invention, which is a method for the identification and the tracing of individual, possibly bifurcating, objects from a pixelated image containing at least one possibly bifurcating object, and possibly containing multiple overlaid, possibly bifurcating objects, said method comprising the steps of (i) defining a first circle having the pixel of origin as centre and a radius of r, wherein r is bigger than the local radius of the possibly bifurcating object, and wherein said pixel of origin is situated within the edges of the object to be identified and traced, (ii) propagating a trace from the centre of the first circle to neighbouring pixels according to the method of any of
claims 1 to 4 until the trace reaches the perimeter of said first circle and until all the pixels which are part of the object to be identified and traced and within the surface of said first circle have been traced, wherein all the pixels which are part of the object to be identified and traced and are also part of the perimeter of said first circle are considered to be a patch, (iii) attributing an identical identifier, for instance a globally unique identifier (GUID), a universally unique identifier (UUID) or a coordinated universal time (UTC) stamp with nanosecond or smaller precision, to all the pixels traced in step (ii) not yet having an identifier, thereby producing identified pixels, (iv) calculating the coordinates of the integer median pixel of each patch present on the perimeter of the first circle, (v) assessing the number of patches or of integer median pixels present on the perimeter of the first circle, wherein (vi) in the cases where there are only one, two or three patches, or integer median pixels, defining a new circle having the respective integer median pixel of origin as centre and a radius of r, wherein r is bigger than the local radius of the possibly bifurcating object, for each of the integer median pixels which have been identified in step (iii), or (vi') in the cases where there are more than three patches, or integer median pixels, defining a new circle having the pixel of origin as centre and a radius of R, wherein R is not equal to r, wherein R is bigger than the local radius of the possibly bifurcating object, and wherein R is chosen so that after repeating steps (ii) and (iv), thus obtaining for each of the previous patches or integer median pixels an alternative patch or integer median pixel present on the perimeter of said circle having a radius of R, the same amount of alternative patches or integer median pixels is present as compared to the number of patches obtained in step (iii) or integer median pixels calculated in step 
(iv), respectively, assessing whether each pair of integer median pixels and alternative integer median pixels has a counterpart lying on a straight line, wherein a straight-line is defined as when the sum of the angles in radians of the triangles formed by the centre of the circles and the integer median pixels and alternative integer median pixels is between or equal to π-x and π, where x is a predefined threshold, for example 1% π, 2% π, 5% π or 10% π, or as when the line joining an integer median pixel and its corresponding alternative integer median pixel, when extended in both directions until it traverses the other side of the circle having a radius of R, has a square of the distance between itself and a similar line originating from another integer median pixel and the alternative integer median pixel from said other integer median pixel that is below a pre-defined threshold, for instance an 8-bit intensity of 20, and (1) wherein any unidentified integer median pixel lying on a straight-line with regard to the integer median pixel identified in step (iii) is a new start point for defining a new circle having said new start point as centre and a radius of r, wherein r is bigger than the local radius of the possibly bifurcating object, (2) wherein any unidentified integer median pixel lying on a straight-line with regard to another unidentified integer median pixel is rejected and therefore is not a new start point, and (3) wherein any unidentified integer median pixel which has not been rejected under step (2) is also considered to be a new start point for defining a new circle having said new start point as centre and a radius of r, wherein r is bigger than the local radius of the possibly bifurcating object, and (vii) repeating steps (ii) to (vii), wherein in steps (ii) to (v) the new circles having a radius of r are used instead of the first circle, and the integer median pixels are used instead of the pixel of origin.
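One reading of the straight-line definition above, sketched in Python: the circle centre, an integer median pixel and its alternative on the larger circle are treated as a (possibly degenerate) triangle, and the trace is "straight" when the angle at the median pixel is within x radians of π, i.e. the three points are nearly collinear. The function name is an assumption.

```python
import math

def on_straight_line(centre, median, alt_median, x=0.05 * math.pi):
    """Straight-line test: True when the angle at `median` between
    `centre` and `alt_median` lies in [pi - x, pi]. x = 5% of pi is one
    of the example thresholds in the text. Points are coordinate
    tuples of any dimension."""
    def angle_at(vertex, p, q):
        u = [pi_ - vi for pi_, vi in zip(p, vertex)]
        v = [qi - vi for qi, vi in zip(q, vertex)]
        dot = sum(ui * vi for ui, vi in zip(u, v))
        norm = math.sqrt(sum(c * c for c in u)) * math.sqrt(sum(c * c for c in v))
        return math.acos(max(-1.0, min(1.0, dot / norm)))

    return math.pi - x <= angle_at(median, centre, alt_median) <= math.pi
```

Under rules (1) to (3) above, an unidentified median pixel passing this test against an identified one becomes a new start point, while one passing it only against another unidentified median pixel is rejected as a crossing object.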
In a preferred embodiment of the above method, the object to be identified and traced is contained in a stack of at least three pixelated images and wherein the circles and perimeters thereof of steps (i) to (vii) are replaced by spheres and surfaces thereof.
In preferred embodiments of the above methods of the invention, the radius r is set at step (i) to be equal to or greater than the mean local edge distance from the centre of the circle or sphere, respectively, when following the shortest one-dimensional path to the edge, or the mean distance from the respective integer median pixel to the patch edge. For example, the radius r can be set at step (i) to be twice the mean local edge distance from the centre of the circle or sphere, respectively.
Most preferably, the methods of the invention are performed on a device capable of performing the required calculations, for instance a computer.
The present invention hence also provides a computer program comprising code means which when executed on a computer perform all the steps of the respective methods of the invention described herein-above.
Another aspect of the present invention is a computer-readable medium comprising embedded or recorded thereon a computer program comprising code means which when executed on a computer perform all the steps of the respective methods of the invention described herein-above.
Accordingly, an aspect of the present invention is a system for identifying an object to be identified and traced in an image stack, comprising code means which when executed on a computer perform all the steps of the respective methods of the invention described herein-above; means for tracing the object; means for differentiating between branches of the object to be identified and other objects; and means for highlighting the object to be identified.
In the different aspects of the invention described herein-above, the objects may be neurons. The display may be adapted to receive a pixelated image stack or to convert a series of images into a pixelated image stack. The display may be adapted to determine or receive a start point on the object or object path to be traced.
The means for tracing may be adapted to section the image into at least one trace area. The trace area may be a sphere having radius r.
The means for tracing may include means for identifying pixels representing objects. The means for tracing may be adapted to perform edge detection on at least one of the objects. The means for tracing may be adapted to propagate a trace from the start point on the object to be identified by determining if neighbouring pixels represent an object and continuing to propagate the trace from those pixels that are determined to represent an object until there are no untraced neighbouring pixels to trace along and/or the boundary of a trace area has been reached.
The means for tracing may be adapted to calculate if an intensity of a pixel and/or pixel noise is above or below a threshold value. The pixel noise may be calculated by determining the standard deviation of intensities of predetermined pixels around the pixel whose pixel noise is being calculated and dividing this by an intensity of the pixel whose noise is being calculated. The predetermined pixels may be a 3x3x3 cube of pixels having the pixel whose noise is being calculated in the centre. The means for tracing may be adapted to assign pixels having an intensity and/or noise above the threshold value as being a pixel on the edge of a cell.
The means for tracing may be adapted to transfer coordinates of pixels of the input image stack that have been determined to represent the object to be identified to an output image stack.
The means for differentiating between branches of the object to be identified and other objects may be adapted to determine patches of pixels, a patch of pixels being defined as a collection of pixels lying on the boundary of the trace area, the pixels representing a part of an object, and in which every pixel in the patch contacts at least one other pixel in the patch. In the case of three dimensional images, the trace area is a local sphere. The means for differentiating between branches of the object to be identified and other objects may be adapted to determine if each patch of pixels represents the traced object or a different object and assigning a new start point in each patch of pixels belonging to the object to be traced.
The means for tracing may be adapted to trace the object in one trace area at a time and to continue the trace by generating a new trace area associated with each patch of pixels that has been determined to belong to the object to be traced.
The means for differentiating between branches of the object to be identified and other objects may be adapted to determine if there is only one untraced patch on a trace area and if so, continue tracing in the direction of the untraced patch centre.
Alternatively or additionally, the means for differentiating between branches of the object to be identified and other objects may be adapted to determine if an untraced patch centre lies substantially opposite the centre of the trace area from an already traced patch and if so, continue the tracing only in the direction of this untraced patch centre.
Alternatively or additionally, in the case where one parent patch gives rise to two or more child patches, a third circle or sphere will be constructed with a radius R' that is bigger than R. The grandchild patch centres will then be determined for each of the child patch centres. In this case the parent patch centre will only be considered valid for trace propagation if the child and grandchild patch centres lie substantially on a straight line.
Alternatively or additionally, the means for differentiating between branches of the object to be identified and other objects may be adapted to determine both patches and child (or alternative) patches for a trace area being a sphere of radius r (where, in the case of neurons, "r" can be defined as the radius of the neuron), a child patch being a patch lying on a sphere of radius 2r around the centre of the trace area. The means for differentiating is further adapted to determine if two non-origin patches on a trace area boundary and two child patches lie substantially on a straight line, and if so not continuing tracing in the direction of these patches. The origin of a patch is identified by generating a GUID shared by all pixels traced at the same moment. In this way pixels that have already been traced will have a different GUID from pixels that are currently being traced. The means for differentiating may be adapted to interpolate a first line between two non-origin patches on a trace area boundary and interpolating a second line between two child patches and continuing tracing in the direction of these patches only if the perpendicular distance between the first and second lines is greater than a branch value. The branch value may be 0.2r, where r is the radius of the trace area.
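The line-interpolation filter just described can be sketched with the standard distance formula between two 3D lines. This is a sketch under assumed names and point representations, not the patented implementation:

```python
import numpy as np

def is_valid_bifurcation(p1, p2, c1, c2, r, branch_factor=0.2):
    """Interpolate a line through the two non-origin patch centres
    (p1, p2) and a line through the two child patch centres (c1, c2);
    report a genuine branch only if the perpendicular separation of the
    two lines exceeds branch_factor * r (0.2r in the text). Otherwise
    the patches are taken to belong to a crossing object."""
    p1, p2, c1, c2 = map(np.asarray, (p1, p2, c1, c2))
    d1, d2 = p2 - p1, c2 - c1
    n = np.cross(d1, d2)
    if np.allclose(n, 0):
        # parallel lines: distance from a point on one line to the other
        sep = np.linalg.norm(np.cross(c1 - p1, d1)) / np.linalg.norm(d1)
    else:
        # skew lines: project the connecting vector onto the common normal
        sep = abs(np.dot(c1 - p1, n)) / np.linalg.norm(n)
    return sep > branch_factor * r
```

A crossing neuron yields patch and child-patch centres that lie almost on one line, so the separation stays below the branch value and the patches are rejected.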
The means for highlighting the object to be identified may be a display showing an object extracted from the initial image. The means for highlighting may be a display adapted to colour the object to be identified in a colour differing from that of the rest of the objects. On a 2D representation of the output image stack, the Z-dimension may be represented through colour intensity where layers that are further away from the observer may be darker and those closer to the observer may be lighter.
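The depth-coded 2D rendering described above can be sketched as follows; the function name and the boolean-stack representation are assumptions.

```python
import numpy as np

def depth_shaded_projection(stack):
    """Render an output image stack as a 2D image in which the
    Z-dimension is represented through intensity: layers further from
    the observer are darker, closer layers lighter. `stack` is a
    boolean (z, y, x) array of traced pixels (z = 0 nearest the
    observer); returns a float (y, x) image in [0, 1]."""
    nz = stack.shape[0]
    out = np.zeros(stack.shape[1:], dtype=float)
    # draw from the deepest layer to the nearest so that closer
    # (brighter) layers overwrite the darker ones behind them
    for z in range(nz - 1, -1, -1):
        brightness = 1.0 - z / nz if nz > 1 else 1.0
        out[stack[z]] = brightness
    return out
```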
According to a further aspect of the present invention is a method for identifying an object in an image stack using one of the methods of the invention described herein-above, comprising sectioning the image into at least one trace area having a boundary; tracing the object; differentiating between branches of the object to be identified and other objects at the boundary of each trace area.
The objects may be neurons. The method may include receiving a pixelated image stack or converting a series of images into a pixelated image stack. The method may further comprise determining or receiving a start point on the object or object path to be traced. The trace area(s) may be spheres having radius r.
Tracing the object may include identifying pixels representing objects. Tracing the object may include performing edge detection on at least one of the objects. Tracing the object may include propagating a trace from the start point on the object to be identified by determining if neighbouring pixels represent an object and continuing to propagate the trace from those pixels that are determined to represent an object until there are no untraced neighbouring pixels to trace along and/or the boundary of a trace area has been reached.
Tracing the object may further comprise calculating if an intensity of a pixel and/or pixel noise is above or below a threshold value. The pixel noise may be calculated by determining the standard deviation of intensities of predetermined pixels around the pixel whose pixel noise is being calculated and dividing this by an intensity of the pixel whose noise is being calculated. The predetermined pixels may be a 3x3x3 cube of pixels having the pixel whose noise is being calculated in the centre. Tracing the object may include assigning pixels having an intensity and/or noise above the threshold value as being a pixel on the edge of a cell.
The method may further comprise transferring the coordinates of pixels of the input image that have been determined to represent the object to be identified to an output image. Each pixel found to represent the object to be identified within any trace area may be assigned a GUID code corresponding to that trace area.
Differentiating between branches of the object to be identified and other objects may comprise determining patches of pixels, a patch of pixels being a collection of pixels lying on the boundary of the trace area, the pixels representing a part of an object, and in which every pixel in the patch contacts at least one other pixel in the patch.
The method may further comprise determining if each patch of pixels represents the traced object or a different object and assigning a new start point in each patch of pixels belonging to the object to be traced.
The identification of patches may include determining the centre of each patch, the coordinates of the centre being defined by calculating the median value of the X, Y and Z axes for the patch. The median may ensure that the patch centre lies on the surface of the trace area. The new start point associated with a patch may be its patch centre.
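The patch-centre calculation just described can be sketched with the per-axis integer median (taking an actual member value on each axis, consistent with "integer median pixel"); the function name is an assumption.

```python
import statistics

def patch_centre(patch_pixels):
    """Centre of a patch as described above: the per-axis integer
    median of the pixel coordinates in the patch, which keeps the
    centre on the surface of the trace area. `patch_pixels` is a
    non-empty list of (x, y, z) tuples; median_low returns an actual
    coordinate value, so the result stays on the pixel grid."""
    return tuple(int(statistics.median_low(axis))
                 for axis in zip(*patch_pixels))
```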
Tracing the object may include tracing the object in one trace area at a time and continuing the trace by generating a new trace area associated with each patch of pixels that has been determined to belong to the object to be traced. Differentiating between branches of the object to be identified and other objects may include determining if there is only one non-origin patch on a trace area and, if so, continuing tracing in the direction of the untraced patch centre.
Alternatively or additionally, differentiating between branches of the object to be identified and other objects may include determining if a non-origin patch centre lies substantially opposite the start point from an already traced patch and, if so, continuing tracing only in the direction of this untraced patch centre.
Alternatively or additionally, differentiating between branches of the object to be identified and other objects may include determining patches for a trace area, being a sphere of radius r, and child patches, a child patch being a patch lying on a sphere of radius 2r around the centre of the trace area. The differentiating method may include determining if two non-origin patches on a trace area boundary and two child patches lie substantially on a straight line, and if so, not continuing tracing in the direction of these patches. The differentiating method may include interpolating a first line between two untraced patches on a trace area boundary and interpolating a second line between two child patches and continuing tracing in the direction of these patches only if the perpendicular distance between the first and second lines is greater than a branch value. The branch value may be 0.2r, where r is the radius of the trace area.
The method may include highlighting the object to be identified by displaying the object extracted from the initial image. The method may include colouring the object to be identified in a colour differing from that of the rest of the objects.
The method may include converting pixel coordinates from a global frame of reference to a local frame of reference for each trace area. The local frame of reference may be a cube centred on the centre of the trace area (the start point for that trace area). The local coordinate frame may be a cube having coordinates running from (0, 0, 0) to (2r, 2r, 2r).
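The coordinate conversion just described can be sketched as a simple translation into a cube of side 2r centred on the trace-area start point; the function names are assumptions.

```python
def to_local(p, start, r):
    """Convert a pixel from the global frame to the local frame of a
    trace area: a cube centred on the start point, with local
    coordinates running from (0, 0, 0) to (2r, 2r, 2r), so the start
    point maps to (r, r, r)."""
    return tuple(pi - si + r for pi, si in zip(p, start))

def to_global(p, start, r):
    """Inverse transform back to the global frame of reference."""
    return tuple(pi + si - r for pi, si in zip(p, start))
```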
According to yet a further aspect of the present invention is a method for tracing an object in an image according to the methods of the invention described herein-above, comprising sectioning the image into at least one trace area, performing edge detection on the object, propagating a trace from a start point on the object by determining if neighbouring pixels represent the object and continuing to propagate the trace from those pixels that are determined to represent the object until there are no untraced neighbouring pixels to trace along and/or the boundary of a trace area has been reached and/or the edge of the object has been reached.
Another aspect of the present invention is a method of performing edge detection using the methods of the invention described herein-above, comprising calculating if an intensity of a pixel and/or pixel noise is above or below a threshold value, wherein the pixel noise is calculated by determining the standard deviation of the intensity of predetermined pixels around the pixel whose pixel noise is being calculated and dividing this by the intensity of the pixel whose noise is being calculated; and assigning pixels having an intensity and/or noise above the threshold value as being a pixel on the edge of an object.
The predetermined pixels may be a 3x3x3 cube of pixels having the pixel whose noise is being calculated in the centre.
According to another aspect of the present invention is a method for differentiating between branches of an object in an image from other objects using the methods of the invention described herein-above, comprising sectioning the image into at least one trace area and tracing the path of an object in at least one of the trace area(s); determining patches of pixels, a patch of pixels being a collection of pixels that represent a part of an object that lies on the boundary of the trace area in which every pixel in the patch contacts at least one other pixel in the patch; determining if there is only one untraced patch on a trace area boundary and if so, designating the untraced patch centre as belonging to the object; and/or determining if an untraced patch centre lies substantially opposite the start point from an already traced patch and if so, designating the untraced patch centre as belonging to the object; and/or determining child patches for a trace area, a child patch being a patch lying on a sphere of radius 2r around the centre of the trace area, determining if two untraced patches on a trace area boundary and two child patches lie substantially on a straight line, and if so designating these patches as not belonging to the object.
The method for differentiating between branches of an object in an image from other objects may additionally or alternately include interpolating a first line between two non-origin patches on a trace area boundary and interpolating a second line between two child patches and designating the patches as belonging to the object only if the perpendicular distance between the first and second lines is greater than a branch value. The branch value may be 0.2r, where r is the radius of the trace area.
For the purpose of the present invention, the term "significant" means either statistically significant as determined by e.g. a Welch T-test where a p value will be less than 0.05, or a difference which is above an absolute threshold, for example 1.5-, 2-, 3-, 4- or 5-fold.
Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. Although methods and materials similar or equivalent to those described herein can be used in the practice or testing of the present invention, suitable methods and materials are described below. All publications, patent applications, patents, and other references mentioned herein are incorporated by reference in their entirety. In case of conflict, the present specification, including definitions, will control. In addition, the materials, methods, and examples are illustrative only and not intended to be limiting.
Description of the Drawings
Various aspects of the invention will now be described by way of example only and with reference to the accompanying drawings of which:
Figure 1 is a schematic of the display apparatus of an embodiment of the present invention;
Figure 2 is an image of a sample of neurons;
Figure 3 is an initial output image;
Figure 4 shows selection of an origin on a neuron in the image of Figure 2;
Figure 5 shows generation of a spherical trace area around the origin of Figure 4;
Figure 6 shows a transformation into an alternative frame of reference for calculations relating to the sphere of Figure 5;
Figure 7 shows the frame of reference of Figure 6, containing the results of a neighbour fill;
Figure 8 shows the results of the neighbour fill shown in Figure 7, transposed into the output frame of Figure 3;
Figure 9 shows two patch centres resulting from the neighbour fill shown in Figure 7;
Figure 10 shows the determination of three patch centres in an alternate neighbour fill;
Figure 11 is a schematic of the process of differentiating between a neuron being traced and a crossing neuron;
Figure 12 shows two valid patch centres on a neuron being traced and rejection of a crossing neuron in an alternate neighbour fill;
Figure 13 is a schematic of the determination of whether or not two patch centres relate to a crossing neuron or branches of the neuron being traced;
Figure 14 is a display showing an output image highlighting the selected neuron.
Description of the Figures
The present invention can also be suitably described with the help of the figures. Figure 1 shows an image processing and display apparatus 5 for tracing and highlighting a selected neuron or neuron path from an image 10 of neurons. The apparatus 5 includes memory 15 for storing input and output image data and any variables used in the processing of the image 10, a processor 20 for processing the image 10 and a display 25, such as an LCD or cathode ray tube monitor, or other output device such as a hard disk, printer, communications link, further memory means and the like, for outputting the processed image 30. In one embodiment, the image processing and display apparatus 5 contains means for receiving an image, such as a communications link, disk drive or memory stick/card/chip/device reader. In this embodiment the apparatus may form a stand-alone station for post-processing an image obtained from a separate image collection apparatus. In another embodiment, the image processing and display apparatus may be integral with the image collection device. This allows real-time imaging and tracing of neurons, which affords greater opportunity to adjust image collection parameters, viewing conditions, imaging angles, equipment setup etc. in order to best extract the required information during the imaging session.
Figure 2 shows an input image 10 of a sample of neuron cells 35, 40. The image may be an image produced by any neuron image collection means known in the art, such as a fluorescence microscope. The image is either digital or digitised such that it is pixelated. The image shows a 2D x,y plane slice of a 3D image stack formed by many such images stacked along the z-axis. For clarity, only an x,y plane image is shown. The input image 10 may be considered as an array of pixels, with each pixel being assigned a value in accordance with its brightness or luminance. For processing purposes, the image stack may be represented mathematically as a 3D matrix of data, the data representing, for example, the coordinates of pixels, the brightness or luminescence of pixels or the colour of pixels. A person skilled in the art would understand the various methods for representing images mathematically for processing purposes and also for reconstructing an output image from a matrix of data once the data is processed. For instance, the degree of darkening (D) could be taken to be 256/number of images and each image going away from the viewer could be one "D" unit darker than the previous one.
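The degree-of-darkening depth cue just described can be sketched as follows. This is an illustrative helper only; the function name and the choice of returning per-slice brightness factors are assumptions, not taken from the patent:

```python
def depth_shade(num_images):
    """Depth cue for a z-stack: with D = 256 / number of images, each
    image going away from the viewer is rendered one D unit darker than
    the previous one.  Returns a brightness factor (0..1) per slice,
    slice 0 being nearest the viewer."""
    d = 256 / num_images
    return [(256 - i * d) / 256 for i in range(num_images)]
```

For a four-image stack this gives factors 1.0, 0.75, 0.5 and 0.25, so the farthest slice is rendered at a quarter of its original brightness.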
The image 10 is processed to determine which pixels represent neurons 35, 40 and which do not. In one embodiment of the present invention, this involves first calculating a pixel noise value for each pixel in the image having an intensity/brightness above a certain limit.
Determining the noise value for a pixel involves defining a 3x3x3 pixel cube whose central pixel is the pixel whose noise value is to be determined. The pixel noise value is calculated by determining the standard deviation of the intensity of the 27 pixels in the contact cube and dividing it by the absolute intensity of the pixel under investigation. Hence, these steps provide a rate of change of intensity. Within a neuron the intensity is fairly uniform; outside a neuron the intensity is also fairly uniform. At the edge, however, the standard deviation briefly becomes large, and the place where this occurs marks the edge. Thus, the pixel noise is used to determine which pixels lie on the edges 45 of a neuron 35, 40. The system assigns pixels having a high pixel noise (above an assigned pixel noise value) as edge pixels and those 50 having a low pixel noise as non-edge. Provided the initial start position lies within a neuron, the observed high-variation boundary will mark the edge of the neuron. A second filter may be applied to check that the local absolute pixel intensity does not drop below a given threshold, to reduce the danger of escaping from a "dark" neuron into the surrounding area. The value of pixel noise above which a pixel is determined as being on the edge 45 of a neuron is selectable by a user and depends on a variety of factors such as the overall intensity of the image 10, the type of cells being imaged, and the image collection conditions and apparatus used. To minimise the danger of optical imperfections resulting in a non-cell region being traced, a further intensity threshold is set that will define an edge 45, even if the pixel noise threshold is not reached. Once the edges of the cell are traced, the pixels 50 that were determined to be within the cell are assigned a status variable of "true" and those pixels 55 outside a cell are assigned a status variable of "false".
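The pixel noise measure described above can be sketched in a few lines; the function name and the zero-intensity guard are illustrative assumptions, not from the patent:

```python
import numpy as np

def pixel_noise(stack, x, y, z):
    """Pixel noise of voxel (x, y, z): the standard deviation of the
    27-voxel 3x3x3 contact cube centred on the voxel, divided by the
    absolute intensity of the voxel itself.  Uniform regions (inside or
    outside a neuron) give low values; the edge gives a high value."""
    cube = stack[z - 1:z + 2, y - 1:y + 2, x - 1:x + 2]
    centre = float(stack[z, y, x])
    if centre == 0.0:
        return float("inf")          # guard against division by zero
    return float(np.std(cube)) / abs(centre)
```

A pixel would then be flagged as an edge pixel when its noise value exceeds the user-selected threshold, subject to the additional absolute-intensity check described above.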
An output image or image stack 30 is then created that has the same dimensions as the input image or image stack 10. The output image or image stack 30 is initially blank, as shown in Figure 3.
At the beginning, the user will be prompted to select a start pixel 60 on the input image 10 corresponding to a point on the neuron 35 that the user wishes to trace, as shown in Figure 4.
A spherical trace area 65 of predefined radius r having the start pixel 60 at its origin is determined, as illustrated in Figure 5. The radius r may be set by the user, or determined automatically, and may be twice the local radius of the neuronal process being traced.
However, to calculate every pixel on the surface of a sphere is computationally expensive. Hence, an alternative embodiment of the methods of the invention makes use of local coordinates. Once local coordinates have been calculated, this information can be mapped to the region being traced using addition/subtraction rather than trigonometry. This embodiment will indicate whether a pixel lies less than "r" pixels from the trace centre or not. This embodiment is hence much faster than calculating the coordinates of a spherical shell for every trace area but gives a similar answer. Hence, in order to provide quicker calculations, a bounding frame 70 having opposing corners defined as (0, 0, 0) and (2r, 2r, 2r), as shown in Figure 6, is calculated to enclose the spherical trace area 65. The global coordinates (with respect to the input image 10) of the pixels within the boundary frame 70 are converted to local coordinates with respect to the boundary frame 70, i.e. the frame of reference has been transposed.
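A minimal sketch of this local-coordinate trick, under the assumption of integer voxel coordinates (the function names are illustrative): once the frame of reference is transposed, sphere membership reduces to additions, subtractions and one squared-distance comparison, with no trigonometry.

```python
def to_local(global_xyz, start_xyz, r):
    """Transpose a global image coordinate into the bounding frame with
    opposing corners (0, 0, 0) and (2r, 2r, 2r); the start pixel maps
    to the frame centre (r, r, r)."""
    return tuple(g - s + r for g, s in zip(global_xyz, start_xyz))

def inside_trace_area(local_xyz, r):
    """True if a local-frame pixel lies no more than r pixels from the
    trace centre (squared-distance test, no square root needed)."""
    return sum((c - r) ** 2 for c in local_xyz) <= r * r
```

The same local table can be reused for every trace area: only the addition/subtraction in `to_local` changes per area, which is what makes this embodiment much faster than recomputing a spherical shell each time.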
Alternatively or additionally, in the methods of the invention, the angles of the trace can be pre-calculated. These angles can be used to determine where the lines go, hence reducing the number of trigonometric calculations used in the tracing loop. Hence, to further speed up the calculation process, the pixels lying on the surface of the spherical trace area 65 and the angles they make to the centre 60 of the spherical trace area 65 can be determined at this point. As shown in Figure 7, a neighbour fill is then carried out starting from the start pixel 60. This involves checking the status variable for each of the pixels neighbouring the start pixel. Those neighbouring pixels having a status value of "true" are stored in a boundary frame data class. The process is repeated for the neighbours of each of the first set of neighbouring pixels 50 whose status value is "true" until all the "true" pixels 50 have been discovered or the boundary 75 of the trace area 65 has been reached.
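The neighbour fill can be sketched as a breadth-first flood fill over 26-connected voxels. Representing the status variables as a dictionary and the trace-area boundary as a set of voxels are assumptions made for illustration:

```python
from collections import deque
from itertools import product

def neighbour_fill(status, start, boundary):
    """Flood-fill outwards from the start pixel, collecting neighbouring
    voxels whose status variable is True, until all reachable "true"
    voxels have been discovered or the trace-area boundary is reached.
    `status` maps (x, y, z) -> bool; `boundary` is the set of voxels on
    the surface of the trace area."""
    filled, queue = {start}, deque([start])
    while queue:
        voxel = queue.popleft()
        if voxel in boundary:
            continue                  # stop expanding at the boundary
        x, y, z = voxel
        for dx, dy, dz in product((-1, 0, 1), repeat=3):
            n = (x + dx, y + dy, z + dz)
            if n not in filled and status.get(n, False):
                filled.add(n)
                queue.append(n)
    return filled
```

Boundary voxels are kept in the result (they seed the patches of the next step) but are not expanded further, so the fill never leaks out of the trace area.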
Once the first trace area 65 has been processed, the pixels 50 from the trace area 65 having a status value of "true" are transformed from local coordinates in the bounding frame 70 to coordinates of the output image 30 and transferred to the output image 30, as shown in Figure 8. A GUID value is generated for the trace area 65 and assigned to each of the pixels 50 within the trace area having a status value of "true". The GUID (Globally Unique Identifier) value may be one of 2^128 combinations (it is a 128-bit number).
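In Python such an identifier could come from the standard uuid module (note that a version-4 UUID is a 128-bit value, though a few of its bits are fixed by the UUID standard, so not all 2^128 combinations occur):

```python
import uuid

# One identifier per trace area; every "true" pixel found in that trace
# area is tagged with it, so later trace areas can tell in which trace
# area a patch was first traced.
trace_guid = uuid.uuid4()
```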
Once the neighbourhood fill for a given trace area 65 has been performed, the next area of the image 10 to which the tracing process is to be applied is determined. As illustrated in Figure 9, those pixels 50 having a status value of "true" and which lie on the surface of the trace area 65 can be grouped together into patches 80, a patch 80 being a collection of pixels 50 representing a neuron 35 (i.e. having status = "true") and lying on a trace area 65 boundary 75, in which every pixel in the patch 80 contacts at least one other pixel in the patch 80. The centre 80c of each patch 80 is then determined by finding the median value of the X, Y and Z coordinates of the pixels in the patch 80.
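The grouping into patches and the median-based patch centres can be sketched as follows (the representation of surface pixels as a set of coordinate tuples is an assumption for illustration):

```python
from itertools import product
from statistics import median

def patch_centres(surface_pixels):
    """Group the "true" pixels lying on the trace-area surface into
    patches - connected components in which every pixel contacts at
    least one other pixel of the patch - and return each patch centre
    as the per-axis integer median of its pixel coordinates."""
    remaining = set(surface_pixels)
    centres = []
    while remaining:
        seed = remaining.pop()
        patch, stack = [seed], [seed]
        while stack:                          # grow one connected patch
            x, y, z = stack.pop()
            for dx, dy, dz in product((-1, 0, 1), repeat=3):
                n = (x + dx, y + dy, z + dz)
                if n in remaining:
                    remaining.remove(n)
                    patch.append(n)
                    stack.append(n)
        centres.append(tuple(int(median(axis)) for axis in zip(*patch)))
    return centres
```

Taking the median per axis (rather than the mean) keeps the centre on an integer pixel coordinate, matching the "integer median pixel" of the claims.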
For the first trace area 65, each of the patch centres 80, 80' is considered to be a new start pixel 60. For each of the new start pixels 60, new trace areas 65 having the new start pixel 60 at the centre of the trace area 65 are determined. For subsequent trace areas 65 it must be determined whether each patch belongs to the neuron 35 being traced or a differing neuron 40.
Figure 10 shows the determination of a trace vector 95, which indicates a direction of movement of the tracing process. The trace vector 95 (in the case of second and subsequent trace areas) is obtained by determining which of the patches 85 on a trace area has a GUID the same as that of the new start point 60 (i.e. was traced in the same trace area 65 as the start point 60). The centre 85c of the patch 85 having the same GUID as the new start point 60 is termed the origin. The vector from the origin 85c to the start point 60 (the centre of the trace area 65) is the trace vector 95.
The next stage is to determine whether each patch centre 80, 90, 90' belongs to the neuron being traced 35 or to a different neuron 40. This process is illustrated in Figures 11-13. If there is a patch centre 80 substantially opposite the start pixel 60 from the origin 85c, i.e. the angle from the origin 85c to the centre point of the trace area (start pixel 60) matches, or is within, for example, ±0.1 radians of, the angle from the centre point (start pixel 60) to another patch centre 80c as shown in Figure 12, then only the patch centre 80c opposite the origin 85c is used as the new start pixel 60; any other patch centres 90, 90' for that trace area 65 are ignored and tracing continues accordingly.
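The "substantially opposite" test amounts to comparing the direction of the incoming trace vector with the direction to each candidate patch centre. A sketch under the assumption of 3D integer coordinates (function name and tolerance parameter are illustrative; the ±0.1 radian default is the example value from the text):

```python
import math

def is_opposite(origin, centre, candidate, tol=0.1):
    """True if `candidate` lies substantially opposite `origin` across
    the trace-area centre: the direction origin -> centre agrees with
    the direction centre -> candidate to within `tol` radians."""
    v1 = [c - o for o, c in zip(origin, centre)]
    v2 = [p - c for c, p in zip(centre, candidate)]
    dot = sum(a * b for a, b in zip(v1, v2))
    norm = math.sqrt(sum(a * a for a in v1)) * math.sqrt(sum(b * b for b in v2))
    # clamp guards against floating-point drift just outside [-1, 1]
    return math.acos(max(-1.0, min(1.0, dot / norm))) <= tol
```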
If there is no patch centre opposite the origin but there is only one patch centre 80 other than that of the origin 85c, then, as above, this patch centre 80 becomes the new start pixel 60 and tracing continues based upon this.
If there is more than one patch 100, 100' other than that of the origin 85c, as shown in Figure 13, then a determination must be made if the other patches 100, 100' belong to the neuron being traced 35 or to another neuron 40. In order to do this, the system ignores non-origin patches 100 where the patch centre 100c, another non-origin patch centre 100c' and two child patch centres 105c and 105c' substantially lie on a straight line. A child patch centre 105c, 105c' is a patch centre that lies on a sphere 110 having radius of 2r around the start pixel 60 of the original trace area 65.
Whether or not this exclusion applies can be determined by interpolating a first line 120 through the two non-origin patch centres 100c, 100c' and interpolating a second line 125 through a child patch centre 105c, 105c' for each of the parent patch centres 100, 100' respectively. If the perpendicular separation 130 of these two lines 120, 125 is greater than 20% of r, then the patches 100, 100' represent a valid bifurcation of the neuron 35 and both patch centres 100c and 100c' become new start pixels 60. If the separation of the two lines 120, 125 is less than 20% of r, then both patches 100 and 100' are designated as belonging to another neuron 40 and ignored. In other words, there are two spheres, both with the same centre: the small one has a radius of "r" and the big one a radius of "2r". The object, e.g. a neuron, is traced from the middle of the small sphere down the paths that either do not have GUIDs set or have GUIDs different from the centre. When the trace reaches the edge of the small sphere, new patch centres are calculated. For each of these the trace continues until the edge of the big sphere is reached. Since the spheres are quite small, there is not much space for the object to split between the edge of the small sphere and the big sphere, but if it did, then each of the big-sphere patches descended from a single small-sphere patch would be used to calculate the angles. If it passes the filters, then this split will be investigated in the next cycle of tracing.
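The perpendicular-separation test of Figure 13 can be sketched with a standard line-to-line distance computation (the function name is an assumption; the 0.2·r branch value is the 20%-of-r threshold from the text):

```python
import numpy as np

def is_valid_bifurcation(p1, p2, c1, c2, r):
    """Branch test: interpolate a first line through the two non-origin
    patch centres (p1, p2) and a second line through their child patch
    centres (c1, c2).  The patches count as a valid bifurcation of the
    traced neuron only if the perpendicular distance between the two
    lines exceeds the branch value 0.2 * r; otherwise they are taken to
    be a crossing neuron passing straight through the trace area."""
    p1, p2, c1, c2 = map(np.asarray, (p1, p2, c1, c2))
    d1, d2 = p2 - p1, c2 - c1          # direction vectors of the lines
    n = np.cross(d1, d2)
    if np.allclose(n, 0):              # parallel lines: point-to-line distance
        sep = np.linalg.norm(np.cross(c1 - p1, d1)) / np.linalg.norm(d1)
    else:                              # skew lines: project onto common normal
        sep = abs(np.dot(c1 - p1, n)) / np.linalg.norm(n)
    return sep > 0.2 * r
```

When all four centres are collinear, the separation is zero and the patches are rejected as a crossing neuron, matching the straight-line exclusion above.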
In an alternate embodiment, whether or not the straight-line exclusion applies can be determined by calculating an angle (a) between a first patch centre 100c, the origin 60 and the child patch centre 105c corresponding to the first patch centre 100c; an angle (a') between a second patch centre 100c', the origin 60 and the child patch centre 105c' corresponding to the second patch centre 100c'; an angle (b) between the first 100c and second 100c' patch centres and the origin 60; and an angle (c) between the child centres 105c, 105c'. If a is within ±0.1 radians of a' and c is within ±0.1 radians of the sum of a + a' + b, then the first and second patches 100, 100' are determined to correspond to a different neuron 40 crossing the neuron being traced 35. In this case, the first and second patches 100, 100' are ignored. However, if a differs from a' by more than 0.1 radians, or c differs from the sum of a + a' + b by more than 0.1 radians, then both patch centres 100c and 100c' are assigned as new start points 60 and the tracing process continues down these branches as described above.
The tracing process, including determining new trace areas 65 for each valid patch centre 85c and performing neighbour fills of each trace area 65, is repeated until the trace areas 65 of all the valid patch centres 80c have been traced and the appropriate pixels having a status value of "true" have been transferred to the output image 30. In this way, the output image 30 shows only the selected neuron 35. Any other neurons 40, including those that cross the selected neuron 35 are filtered out from the original image 10. In an optional embodiment shown in Figure 14, the colour of pixels associated with the output image 30 may be changed and the output image 30 superimposed upon the original image 10 in order to highlight the traced neuron 35 over the other neurons 40 in the original image 10.
A skilled person will appreciate that variations of the disclosed arrangements are possible without departing from the invention. For example, whilst the above invention is described in relation to identifying and tracing neurons 35, a skilled person will appreciate that the above method may also be applied to other objects. Although the above specific description advantageously describes the tracing of neurons 35 and neuron pathways, the above method and apparatus may also be adapted for use in the automated counting of objects in which overlapping objects need to be counted as two objects rather than one. An example of this is the counting of a sample of worms such as C. elegans.
In an alternative example, the above method could also be used to extract objects, e.g. trees, or people from movie sequences by replacing the z-stack information of the 3D neuron images by a series of images in the time domain. The edge detection and cross point detection techniques described above may be used to identify a person or object, e.g. tree, from a complex background. The defined person or object may then be highlighted or removed from the movie sequence or subjected to other processing.
As is apparent to one of ordinary skill in the art, variations in the above-described methods can be introduced with ease to attain the same objective. Various incubating conditions, labels, apparatus and materials can be chosen according to individual preference. All publications referred to herein are incorporated by reference in their entirety as if each were referred to individually.

Claims
1. A method for the identification and the tracing of individual objects from a pixelated image containing at least one object said method comprising propagating a trace from a pixel of origin situated within the edges of the object to be identified and traced, or from a last traced pixel, by determining if pixels neighbouring said pixel of origin, or last traced pixel, are also part of the object to be identified and traced, and propagating the trace to neighbouring pixels found to be part of the object to be identified and traced, wherein determining if a neighbouring pixel is part of the object to be identified and traced is performed by
(a) constructing an area around said pixel of origin, or last traced pixel, said area comprising at least one additional pixel in every direction,
(b) constructing an area around each of the untraced pixels neighbouring said pixel of origin, or last traced pixel, (neighbouring pixel) said area comprising the same number of additional pixels in every direction as in the area constructed in step (a),
(c) calculating a respective standard deviation of the intensity of all the pixels of the area for each of the areas constructed in steps (a) and (b), optionally normalising said standard deviation, preferably by dividing said standard deviation by the intensity of the pixel of origin, or last traced pixel, or neighbouring pixel, respectively,
wherein a significant increase, or a value above a pre-defined threshold, of the standard deviation calculated for the pixels of the area constructed in step (b) around a particular neighbouring pixel as compared to the standard deviation calculated for the pixels of the area constructed in step (a) around said pixel of origin, or last traced pixel, indicates that said particular neighbouring pixel is part of the edge of the object to be identified and traced and is therefore not part of the object to be identified and traced, whereas a similar value, or a value below a pre-defined threshold, for the standard deviation calculated for the pixels of the area constructed in step (b) around a particular neighbouring pixel as compared to the standard deviation calculated for the pixels of the area constructed in step (a) around said pixel of origin, or last traced pixel, indicates that said particular neighbouring pixel is part of the object to be identified and traced,
(d) propagating the trace to the untraced neighbouring pixels which have been determined in step (c) to be part of the object to be identified and traced, and
(e) repeating steps (a) to (d) for each of the pixels traced in step (d) until there are no untraced neighbouring pixels identified as part of the object to be identified and traced and/or until the boundary of another already traced area has been reached.
2. A method for the identification and the tracing of individual objects from a pixelated image containing at least one object said method comprising propagating a trace from a pixel of origin situated within the edges of the object to be identified and traced, or from a last traced pixel, by determining if pixels neighbouring said pixel of origin, or last traced pixel, are also part of the object to be identified and traced, and propagating the trace to neighbouring pixels found to be part of the object to be identified and traced, wherein determining if a neighbouring pixel is part of the object to be identified and traced is performed by
(a) constructing an area around each of the untraced pixels neighbouring said pixel of origin, or last traced pixel, (neighbouring pixel), said area comprising at least one additional pixel in every direction,
(b) calculating a respective standard deviation value of the intensity of all the pixels of the area constructed in step (a) around each of the untraced neighbouring pixels,
(c) normalising each of the standard deviation values calculated in step (b), by dividing said standard deviation value by the intensity of the untraced neighbouring pixel around which the area has been constructed in step (a), hence obtaining a normalised standard deviation value for each of the untraced neighbouring pixels,
wherein a normalised standard deviation value above a pre-defined threshold indicates that the untraced neighbouring pixel around which the area has been constructed in step (a) is part of the edge of the object to be identified and traced and is therefore not part of the object to be identified and traced, whereas a normalised standard deviation value below or equal to a pre-defined threshold indicates that said particular untraced neighbouring pixel around which the area has been constructed in step (a) is part of the object to be identified and traced,
(d) propagating the trace to the untraced neighbouring pixels which have been determined in step (c) to be part of the object to be identified and traced, and
(e) repeating steps (a) to (d) for each of the pixels traced in step (d) until there are no untraced neighbouring pixels identified as part of the object to be identified and traced and/or until the boundary of another already traced area has been reached.
3. The method of claim 1 or 2 wherein the object to be identified and traced is contained in a stack of at least three pixelated images and wherein the areas of steps (a) to (c) are replaced by volumes.
4. The method of any of claims 1 to 3 wherein the area is a square of 3x3 pixels or the volume is a cube of 3x3x3 pixels, respectively.
5. A method for the identification and the tracing of individual, possibly bifurcating, objects from a pixelated image containing at least one possibly bifurcating object, and possibly containing multiple overlaid, possibly bifurcating objects, said method comprising the steps of
(i) defining a first circle having the pixel of origin as centre and a radius of r, wherein r is bigger than the local radius of the possibly bifurcating object, and wherein said pixel of origin is situated within the edges of the object to be identified and traced,
(ii) propagating a trace from the centre of the first circle to neighbouring pixels according to the method of any of claims 1 to 4 until the trace reaches the perimeter of said first circle and until all the pixels which are part of the object to be identified and traced and within the surface of said first circle have been traced, wherein all the pixels which are part of the object to be identified and traced and are also part of the perimeter of said first circle are considered to be a patch,
(iii) attributing an identical identifier to all the pixels traced in step (ii) not yet having an identifier, thereby producing identified pixels,
(iv) calculating the coordinates of the integer median pixel of each patch present on the perimeter of the first circle,
(v) assessing the number of patches or of integer median pixels present on the perimeter of the first circle, wherein
(vi) in the cases where there are only one, two or three patches, or integer median pixels, defining a new circle having the respective integer median pixel of origin as centre and a radius of r, wherein r is bigger than the local radius of the possibly bifurcating object, for each of the integer median pixels which have been identified in step (iii), or
(vi') in the cases where there are more than three patches, or integer median pixels, defining a new circle having the pixel of origin as centre and a radius of R, wherein R is not equal to r, wherein R is bigger than the local radius of the possibly bifurcating object, and wherein R is chosen so that after repeating steps (ii) and (iv), thus obtaining for each of the previous patches or integer median pixels an alternative patch or integer median pixel present on the perimeter of said circle having a radius of R, the same amount of alternative patches or integer median pixels is present as compared to the number of patches obtained in step (iii) or integer median pixels calculated in step (iv), respectively, assessing whether each pair of integer median pixels and alternative integer median pixels has a counterpart lying on a straight line, wherein a straight-line is defined as when the sum of the angles in radians of the triangles formed by the centre of the circles and the integer median pixels and alternative integer median pixels is between or equal to π-x and π, where x is a predefined threshold, for example 10% π, or as when the line joining an integer median pixel and its corresponding alternative integer median pixel when extended in both directions until it traverses on the other side of the circle having a radius of R has a square of the distance between itself and a similar line originating from another integer median pixel and the alternative integer median pixel from said other integer median pixel that is below a pre-defined threshold and
(1) wherein any unidentified integer median pixel lying on a straight-line with regard to the integer median pixel identified in step (iii) is a new start point for defining a new circle having said new start point as centre and a radius of r, wherein r is bigger than the local radius of the possibly bifurcating object,
(2) wherein any unidentified integer median pixel lying on a straight-line with regard to another unidentified integer median pixel is rejected and therefore is not a new start point, and
(3) wherein any unidentified integer median pixel which has not been rejected under step (2) is also considered to be a new start point for defining a new circle having said new start point as centre and a radius of r, wherein r is bigger than the local radius of the possibly bifurcating object,
and
(vii) repeating steps (ii) to (vii), wherein in steps (ii) to (v) the new circles having a radius of r are used instead of the first circle, and the integer median pixels are used instead of the pixel of origin.
6. The method of claim 5 wherein the object to be identified and traced is contained in a stack of at least three pixelated images and wherein the circles and perimeters thereof of steps (i) to (vii) are replaced by spheres and surfaces thereof.
7. The method of any of claims 5 or 6 wherein at step (i) the radius r is set to be equal to or greater than the mean local edge distance from the centre of the circle or sphere, respectively, when following the shortest one-dimensional path to the edge, or the mean distance from the respective integer median pixel to the patch edge.
8. The method of any of claims 5 to 7 wherein at step (i) the radius r is set to be twice the mean local edge distance from the centre of the circle or sphere, respectively.
9. A computer program comprising code means which when executed on a computer perform all the steps of the method of any of claims 1 to 8.
10. A computer-readable medium comprising the computer program of claim 9 embedded or recorded thereon.
PCT/EP2008/066584 2007-12-03 2008-12-02 Improved display of multiple overlaid objects WO2009071526A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP07122126.1 2007-12-03
EP07122126 2007-12-03

Publications (1)

Publication Number Publication Date
WO2009071526A1 true WO2009071526A1 (en) 2009-06-11

Family

ID=39312961

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2008/066584 WO2009071526A1 (en) 2007-12-03 2008-12-02 Improved display of multiple overlaid objects

Country Status (1)

Country Link
WO (1) WO2009071526A1 (en)


Non-Patent Citations (7)

* Cited by examiner, † Cited by third party
Title
GYU-DONG LEE ET AL: "Rough edge detection of low contrast images using consequential local variance maxima", TENCON 99. PROCEEDINGS OF THE IEEE REGION 10 CONFERENCE CHEJU ISLAND, SOUTH KOREA 15-17 SEPT. 1999, PISCATAWAY, NJ, USA,IEEE, US, vol. 1, 15 September 1999 (1999-09-15), pages 734 - 737, XP010368357, ISBN: 0-7803-5739-6 *
HARALICK R M ET AL: "IMAGE SEGMENTATION TECHNIQUES", COMPUTER VISION GRAPHICS AND IMAGE PROCESSING, ACADEMIC PRESS, DULUTH, MA, US, vol. 29, no. 1, 1985, pages 100 - 132, XP000974960 *
KAI ZHANG ET AL: "A 3D Self-Adjust Region Growing Method for Axon Extraction", IMAGE PROCESSING, 2007. ICIP 2007. IEEE INTERNATIONAL CONFERENCE ON, IEEE, PI, September 2007 (2007-09-01), pages II - 433, XP031157954, ISBN: 978-1-4244-1436-9 *
LIU XINCHUN ET AL: "Edge-Detection Based on the Local Variance in Angiographic Images", JOURNAL OF ELECTRONICS, vol. 7, no. 4, October 2000 (2000-10-01), pages 338 - 344, XP002478144 *
RANGA SRINIVASAN ET AL: "3-D CENTERLINE EXTRACTION OF AXONS IN MICROSCOPIC STACKS FOR THE STUDY OF MOTOR NEURON BEHAVIOR IN DEVELOPING MUSCLES", BIOMEDICAL IMAGING: FROM NANO TO MACRO, 2007. ISBI 2007. 4TH IEEE INTERNATIONAL SYMPOSIUM ON, IEEE, PI, April 2007 (2007-04-01), pages 93 - 96, XP031084218, ISBN: 1-4244-0671-4 *
TAMEZ-PENA J G ET AL: "The integration of automatic segmentation and motion tracking for 4D reconstruction and visualization of musculoskeletal structures", BIOMEDICAL IMAGE ANALYSIS, 1998. PROCEEDINGS. WORKSHOP ON SANTA BARBARA, CA, USA 26-27 JUNE 1998, LOS ALAMITOS, CA, USA,IEEE COMPUT. SOC, US, 26 June 1998 (1998-06-26), pages 154 - 163, XP010291418, ISBN: 0-8186-8460-7 *
YONG ZHANG ET AL: "3-D axon structure extraction and analysis in confocal fluorescence microscopy images", LIFE SCIENCE SYSTEMS AND APPLICATIONS WORKSHOP, 2007. LISA 2007. IEEE/NIH, IEEE, PI, November 2007 (2007-11-01), pages 241 - 244, XP031189226, ISBN: 978-1-4244-1812-1 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8920236B2 (en) 2007-11-02 2014-12-30 Bally Gaming, Inc. Game related systems, methods, and articles that combine virtual and physical elements
US9613487B2 (en) 2007-11-02 2017-04-04 Bally Gaming, Inc. Game related systems, methods, and articles that combine virtual and physical elements
US8285034B2 (en) 2009-08-26 2012-10-09 Bally Gaming, Inc. Apparatus, method and article for evaluating a stack of objects in an image
US8606002B2 (en) 2009-08-26 2013-12-10 Bally Gaming, Inc. Apparatus, method and article for evaluating a stack of objects in an image

Similar Documents

Publication Publication Date Title
US9773302B2 (en) Three-dimensional object model tagging
Sohn et al. Predicting visual discomfort using object size and disparity information in stereoscopic images
EP3351001B1 (en) Method for encoding a light field content
RU2008131769A (en) METHOD AND INSTALLATION FOR IDENTIFICATION OF MATERIALS USING RADIOGRAPHIC IMAGES OF BINOCULAR STEREOSCOPY OBTAINED FOR VARIOUS RADIATION ENERGIES
CN102812715A (en) Three-dimensional imaging device and three-dimensional imaging method
Jung et al. Depth sensation enhancement using the just noticeable depth difference
CN104182952B (en) Multi-focus sequence image fusion method
CN101180658A (en) Depth perception
CN111524100A (en) Defect image sample generation method and device and panel defect detection method
CN103415869A (en) Method of detecting and quantifying blur in a digital image
JP2023172882A (en) Three-dimensional representation method and representation apparatus
CN104184936B (en) Image focusing processing method and system based on light field camera
CN113052754B (en) Method and device for blurring picture background
WO2009071526A1 (en) Improved display of multiple overlaid objects
CN109902751A Dial digit character recognition method combining convolutional neural networks and half-character template matching
CN112489103B (en) High-resolution depth map acquisition method and system
CN109087344A (en) Image-selecting method and device in three-dimensional reconstruction
Akhyar et al. A beneficial dual transformation approach for deep learning networks used in steel surface defect detection
CN105872516A (en) Method and device for obtaining parallax parameters of three-dimensional film source
CN114879377B (en) Parameter determination method, device and equipment of horizontal parallax three-dimensional light field display system
Faluvégi et al. A 3D convolutional neural network for light field depth estimation
KR20020020941A (en) Method, System and Apparatus
Danahy et al. Directional edge detection using the logical transform for binary and grayscale images
US10679354B2 (en) Method for the graphics processing of images
CN110992474A Method for implementing a time-domain technique

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 08857550
    Country of ref document: EP
    Kind code of ref document: A1

NENP Non-entry into the national phase
    Ref country code: DE

122 Ep: pct application non-entry in european phase
    Ref document number: 08857550
    Country of ref document: EP
    Kind code of ref document: A1