WO1998047095A1 - Specimen evaluation device - Google Patents


Info

Publication number
WO1998047095A1
Authority
WO
WIPO (PCT)
Prior art keywords
feature map
specimen
slide
image
images
Application number
PCT/US1998/007100
Other languages
French (fr)
Inventor
Eunice K. Lin
Kirill A. Khazan
Lila Poritsky
Original Assignee
Neuromedical Systems, Inc.
Application filed by Neuromedical Systems, Inc. filed Critical Neuromedical Systems, Inc.
Priority to AU71060/98A priority Critical patent/AU7106098A/en
Publication of WO1998047095A1 publication Critical patent/WO1998047095A1/en

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00: Scenes; Scene-specific elements
    • G06V20/60: Type of objects
    • G06V20/69: Microscopic objects, e.g. biological cells or cellular parts


Abstract

A method of displaying a user-defined mapping of a medical specimen (20) includes creating a feature map (22) of the medical specimen (20). User-defined locations are then mapped onto the feature map (22) to create a custom slide which is then displayed. A system (10) for displaying user-defined mapping of a medical specimen (20) includes a processor (26) for receiving the image and feature map data. A display (16) is coupled to the processor (26) for displaying selected visual images (18) and the feature map (22). A mouse (28) is coupled to the processor (26) to identify particular images (18) on the display (16). The processor (26), in response, effectuates a locational indication of the identified image (18) on the displayed feature map (22) by mapping the location and nature of the image onto the displayed feature map (22).

Description

Title: SPECIMEN EVALUATION DEVICE
TECHNICAL FIELD OF THE INVENTION This invention relates generally to the display of specimens for evaluation, and more particularly to the generation and display of a map of the specimen for use in the evaluation of biological specimens.
BACKGROUND OF THE INVENTION In the medical industry it is often useful to review a cytological or histological specimen for the presence of cells of a certain cellular type. One example of a cytological specimen requiring such review is a cervical or Pap smear which is typically reviewed by a user for the presence of certain cells, such as malignant or premalignant cells. A Pap smear often contains as many as 100,000 to 200,000 or more cells and other objects, each of which must be individually inspected in order to determine the possible presence of very few malignant or premalignant cells. There are also many other instances in which it is necessary or desirable to inspect hundreds or thousands of objects or areas of an object to ensure that a specimen is adequate or that a device, such as a silicon wafer, has been properly manufactured.
Where the review of a specimen or an object involves the inspection of several or hundreds of cells or objects, it is desirable to facilitate the inspection by providing images of the cells or objects on a display. One system which has successfully accomplished an enhanced review of a cytological specimen by selecting cells for display on a monitor and inspection by a cytotechnologist is the PAPNET Testing System, manufactured by Neuromedical Systems, Inc. of Suffern, New York.
In the PAPNET Testing System an array of cellular images taken from a Pap smear is displayed on a monitor. A user (sometimes referred to as a cytotechnician) can review the images and determine if the Pap smear specimen warrants further inspection under a microscope. Sometimes such determining may be referred to as classifying or evaluating the specimen and/or the objects contained in the specimen. In a review system such as the PAPNET Testing System, another biological review system, or a device manufacturing inspection system, it would be desirable to view the slide concurrently with the images, because contextual information is sometimes relevant, and to provide a means for identifying the locations of certain cells or objects in a specimen or on a device, because doing so would facilitate the inspection of other images in the specimen or on the device.
SUMMARY OF THE INVENTION The present invention includes a method and system for developing and displaying a map of cells, areas or objects of interest in a specimen or on a device. The invention includes providing a graphical display of the specimen or object being reviewed with areas of interest overlaid on the display. The graphical display indicates the presence of, and allows for the differentiation between, for example, areas of biological matter in general on a Pap smear, air bubbles in the smear, cells displayed for inspection and cells determined through the inspection to warrant further review through a microscope. The areas may be identified and differentiated through a variety of methods including representing different categories or areas of interest with different colors. In accordance with another aspect of the invention, a system for displaying a user-defined mapping of a medical specimen includes a processor for receiving the data which represents the medical specimen and a display which is coupled to the processor for displaying selected visual images and a feature map. The selected visual images are representative of portions of the medical specimen and the feature map is an image of the entire specimen on the slide. An input device is coupled to the processor and is operable to send a signal to the processor which is indicative of identified images and the processor in turn effectuates a locational indication of the identified image on the feature map. In another aspect of the invention, a method of mapping user-defined locations from a plurality of display images onto a feature map is disclosed.
The method includes selecting at least one image from the plurality of display images, identifying the location(s) of the image(s) on the slide, e.g., by information representing coordinates of the microscope stage, sometimes referred to as slide coordinates, associated with the selected image(s), converting the slide coordinates to feature map coordinates, and modifying the feature map data to indicate a selected image of interest.
In yet another aspect of the invention, a method of mapping a location of a viewing apparatus with respect to a viewing specimen onto a feature map is disclosed. The method includes the steps of identifying the positional control indices of the viewing apparatus, calculating the slide coordinates that correspond to the positional control indices, converting the slide coordinates to feature map coordinates, and modifying the feature map data to indicate the location of the viewing apparatus with respect to the slide.
These and other objects, advantages, features and aspects of the present invention will become apparent as the following description proceeds.
To the accomplishment of the foregoing and related ends, the invention, then, comprises the features hereinafter fully described in the specification and particularly pointed out in the claims. The following description and the annexed drawings set forth in detail a certain illustrative embodiment of the invention. This embodiment, however, is but one of various ways in which principles of the invention may be employed. It will be appreciated that the scope of the invention is to be determined by the claims and the equivalents thereof.
BRIEF DESCRIPTION OF THE DRAWINGS In the annexed drawings:
Figure 1 is a system level diagram illustrating a system for generating and displaying user-defined feature maps;
Figure 2 is a pictorial diagram illustrating selected tiles of interest and a feature map on a display; Figure 2A is a color image depicting the feature map portion of Figure 2 and illustrating an example of use of color in the feature map; Figure 3 is a block diagram illustrating the system components for generating and displaying user-defined feature maps;
Figure 4 is a flowchart diagram illustrating the function of the invention, namely the generation and display of a user-defined feature map; Figure 5 is a pictorial diagram illustrating a plurality of viewing tiles containing cell images on a display;
Figure 6 is a pictorial diagram illustrating a feature map display; Figure 7 is a block level diagram illustrating the different manners in which user-defined flags are mapped onto the feature map; Figure 8 is a flowchart diagram illustrating the steps taken in mapping a user-defined flag onto the feature map when either a present tile is being reviewed or when several tiles of interest have been tagged by a reviewer; and Figure 9 is a flowchart diagram illustrating the steps taken in mapping a user-defined flag onto the feature map when communicating the present stage location of a viewing microscope.
DETAILED DESCRIPTION OF THE INVENTION
With reference to the drawings and initially to Figure 1, there is shown a specimen mapping system 10 in accordance with the invention employed generally in the environment of an exemplary specimen classification system 12 and particularly with the components directed to a specimen review station 14.
The exemplary classification system is a semi-automated screener for screening cervical or Pap smears for the presence of cellular abnormalities which may indicate a cancerous or precancerous condition, although it will be appreciated that the specimen mapping system 10 of the present invention will have equal applicability to other automated or semi-automated classification devices.
The review station 14 of the specimen classification system 12 includes a display monitor 16 for the display of images 18 of, in the exemplary embodiment, cells taken from a specimen 20 to be classified and a visual feature (or slide) map 22 of areas or features of interest in the specimen 20. The specimen review station 14 further includes an automated microscope system 24 having microscope optics 25 for obtaining a direct view of a desired area of the specimen 20, such as to review certain cells from the display 16 directly through the microscope system 24, and a processor 26 for controlling the various components of the reviewing station 14, for performing image processing functions and for interacting with a user such as a cytotechnician, and a mouse 28 or similar input device facilitating the transfer of information from the user to the processor 26.
As shown more clearly in Figure 2, the images of cells 18 displayed on the display monitor 16 may be arranged in an array of individual cells and their contextual surroundings for inspection and classification by a user and may share the display 16 with the feature map 22 of the specimen 20. The feature map 22 preferably generally resembles the specimen 20 being classified and the carrier of the specimen 20, in this instance a slide 30 upon which the smear specimen 20 has been deposited, and provides a visual indication of information relevant to the classification process for the specimen 20. For example, the feature map 22 may indicate a barcode label 31 and a slide section 32, the boundary of the slide cover slip 33 which delineates an exterior area 34 and a cover slip area 35, the areas 36 under the cover slip 33 found to include biological matter (also called the specimen area), and the areas 38 within the area under the cover slip 33 found to represent air bubbles. The feature map 22 may also include the locations 40 of cells determined to possibly or most likely be suspicious by a classification process performed by the processor 26 (also called the highest ranked image markers), the locations 42 of cells determined by the user as warranting further review, such as through the microscope system 24, the area 44 currently being reviewed by the user on the display 16, and the area 45 currently in the field of view of the microscope system 24 (also called the microscope location).
The different features in the feature map 22 are preferably also distinguishable from one another such as by color coding. In such an instance, the areas of biological matter 36 may appear blue, the air bubbles 38 may appear white, the location of cells 40 found by the processor 26 may appear yellow, the cells warranting further review 42 as determined by the user may appear red, the cells currently being reviewed 44 on the display 16 may appear green and the area in the field of view of the microscope system 24 may appear pink, for example. An exemplary color image showing several, but not all, of such colors is presented in Figure 2A. Other techniques may be used to identify or to represent respective areas instead of or in addition to colors; examples include shading, outlining, use of respective symbols, blinking or flashing, and others.
Consequently, a feature map 22 results which is user defined with the information most valuable to the classification or inspection process for a specific specimen 20. As the relative positions of the cells in the specimen 20 may sometimes be relevant to the individual classification of a cell or of the specimen as a whole, the feature map 22 provides a valuable asset in allowing a user to see where the cells appearing in the display area of the display monitor 16 are found in the specimen 20 as well as where a particular cell being viewed through the microscope system 24 resides in the specimen 20 relative to other cells found to be of interest and flagged in the feature map 22.
Turning to a more detailed description of the environment in which the feature mapping system 10 in accordance with the present invention may be used, and with further reference to Figure 3, the specimen classification system 12 is seen as preferably including, in addition to the review station 14 illustrated in Figure 1, a scanning device 46 and a classification device 48 which together create a non user-defined portion of the feature map 22, e.g., as may be defined by the system itself. The scanning device 46 scans the specimen slide 30 and determines the boundaries of the cover slip 33 on the slide, air bubbles 38 trapped under the cover slip 33, the locations of biological material 36 under the cover slip 33 for review and analysis in the classification process and/or other areas to be isolated from further review in the classification process performed by the classification device 48. The scanning device 46 preferably includes optics for obtaining a magnified view of the specimen 20, a camera and other components for creating one or more digital representations of the specimen 20 suitable for processing and a processor for defining a scan map of areas in the specimen to which further processing will be limited. One suitable scanning device 46 and a method of generating a scan map are described in U.S. Patent Application Serial No. 08/576,988 filed December 19, 1995, which is incorporated herein by this reference. The scan map is stored and provided for use by the classification device 48 in directing a further scan of the specimen 20, such as for use in performing a semi-automated classification of the areas of the specimen 20 corresponding to the scan map. Suitable methods and systems for classifying the specimen 20 and determining the areas of interest, such as cells which are most likely to represent pre-malignant or malignant cells among the population of cells in the specimen, are described in U.S. Patent Nos. 4,965,725; 5,257,182; 5,287,272; and 5,544,650, all of which are incorporated herein by this reference. The classification device 48 preferably includes an automated microscope and camera for generating digital images of the biological matter within the scan map, and one or more processors for performing various image processing and classification operations on the images, including neural network functions, adaptive processing and fuzzy logic operations. The scanning device 46 and the classification device 48 may be embodied as physically separate components or as one device serving both functions. The classification device 48 selects a number of cell images 18 for further review on the display 16 by the user and stores the cell images 18 on storage media along with the scan map. The locations of the cells selected for review 50 (also called the slide coordinates) are also recorded to be displayed as desired in conjunction with the scan map to form the basic features of the feature map 22. The review station 14 may be embodied with the scanning device 46 and the classification device 48. If desired, the review station may be a separate component and may, as such, be located geographically distant from the scanning device 46 and the classification device 48. For example, multiple review stations 14 may be resident at different laboratory sites while one or more scanning devices 46 and classification devices 48 may be located at a common center supporting multiple laboratory sites.
Data, such as cell images 18 and feature maps 22, may be recorded on magnetic or optical media, for example, and shipped along with the specimens 20 analyzed between the review station 14, the scanning device 46 and the classification device 48 or may be transmitted between the components through telephonic or other conventional data links.
The review station 14 preferably allows a user to place the specimen 20 from which the images 18 are being reviewed on the stage of the microscope system 24 and concurrently links the processor 26 to the data stored by the classification device 48 for that specimen 20. A user may then, for example using the mouse 28, direct the processor 26 to display an array of images 18 for review as well as the feature map 22 of the scanned specimen 20 showing the cover slip boundary 33, the biological matter found on the slide 36, the air bubbles 38 and the location of the images displayed on the monitor for review. As the user directs a cursor through a particular displayed image 18, the location of the cell image 44 in the feature map 22 may additionally be highlighted to allow the user readily to determine the relative position of the cell image 18 in the specimen 20. If the user finds a particular cell warranting additional direct review through a microscope or by a supervisor or other person, the user can inform the processor 26 to additionally color code that location on the feature map, such as by changing the color code from yellow to red.
In an embodiment the microscope is an automated one. Such automation may be achieved by the processor controlling movement of the stage relative to the optics. In another embodiment the stage is permitted to move and an encoder identifies the relative location of the stage to the optics. The user can also instruct the processor 26 to move the specimen 20 relative to the microscope optics 25 in an automated microscope to bring a certain cell on the display 16 into the field of view of the microscope to permit direct viewing of the cell and surrounding matter. The location of the area 45 in the field of view of the microscope can be identified on the feature map 22 such as by making the location appear pink, for example. Other changes or modifications can also be made on the feature map 22 to allow the user easily to discern and/or to record the relative positions of other events of interest using color coding or some other indicator, such as blinking, outlining, etc.
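The re-flagging described above, where the user promotes a location's coding from yellow (machine-ranked) to red (marked for further review), amounts to a small update of the feature map data. A minimal sketch follows; the numeric flag codes are hypothetical, since the patent does not specify the byte values used for each status.

```python
# Hypothetical byte codes for two of the feature map statuses;
# the patent names the statuses but not their stored values.
RANKED_MARKER = 4    # location of a highest ranked image (yellow)
FURTHER_REVIEW = 5   # user marked the location for further review (red)

def flag_for_review(flag_grid, col, row):
    """Re-flag a feature map element the user marks for further review,
    analogous to changing its color code from yellow to red."""
    if flag_grid[row][col] == RANKED_MARKER:
        flag_grid[row][col] = FURTHER_REVIEW
    return flag_grid[row][col]
```

In use, the display layer would then repaint that element with the color assigned to the new flag.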
Figure 4 is a flowchart diagram which illustrates the functional steps that the classification device 48 and processor 26 utilize to generate and display the user-defined feature maps 22 for evaluating the medical specimen 20. At step 54, the feature map 22 is created as already described in conjunction with Figure 3 and is subsequently read at step 56 from the tape, disk, CD-ROM or other storage device (hereinafter referred to as "tape" for brevity) (upon which the classification device 48 placed the feature map information) into the memory associated with the processor 26 of the reviewing station 14. Because the data stored on the tape includes image files of the specimen 20, slide coordinates of each image of the specimen 20, and feature map data, the processor 26 first opens the data file and reads the file headers to identify the location of the feature map data. After getting the location of the feature map data, the processor 26 gets the size and resolution information of the feature map 22 and reads the feature map data as a sequence of bytes. In an embodiment, the matrix or sequence of bytes is 100x70 bytes, with one byte representing one feature map element. Other matrices, sequences, or numbers of bytes or elements may be used, as will be appreciated. Each feature map element (byte) represents one medium resolution field of view as seen by the scanning device 46. Each byte therefore contains a flag which indicates its status: an exterior area 34 outside the cover slip 33, the bubble area 38, the area 35 within the cover slip 33, the specimen area 36, or a highest ranked image marker area 40. In an embodiment the scanner may identify images which are intended to be reviewed in the review station. For example, the scanner may identify a prescribed number of images based on ranking processed images. In one example the locations of the 128 highest ranked images are identified and may be represented by such image marker areas 40. The processor 26 can then access the feature map information from the memory. The feature map 22 may then be displayed on the display 16 with the ranked images 18 and a user may, utilizing the mouse 28 and the display 16, select various images 18 that he or she considers to be images of interest. The images of interest, the present image being evaluated, and the location of the microscope 25 above the slide 30 are the informational components which define the user-defined flags of step 58 in Figure 4. By making these selections, the user is mapping the user-defined flags onto the feature map 22. The processor 26 takes this information and creates an image palette to color code the user-defined flagged areas at step 60 and subsequently converts the feature map 22, which is now a custom feature map, to a standard image format at step 62 and displays the custom feature map 22 with color coding at step 64. The reviewing user may then evaluate, via the display 16, both the selected images 18 as well as the corresponding user-defined feature map 22.
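The read-and-unpack sequence described above can be sketched in Python. The 100x70 dimensions follow the embodiment in the text; treating the byte sequence as row-major and unpacking it into a two-dimensional grid are assumptions for illustration, as the patent specifies only that one byte represents one feature map element.

```python
FEATURE_MAP_WIDTH = 100   # feature map elements per row (per the embodiment)
FEATURE_MAP_HEIGHT = 70   # rows in the feature map

def read_feature_map(raw: bytes):
    """Unpack the flat sequence of flag bytes into a row-major 2-D grid,
    one integer flag per medium resolution field of view."""
    expected = FEATURE_MAP_WIDTH * FEATURE_MAP_HEIGHT
    if len(raw) != expected:
        raise ValueError(f"expected {expected} bytes, got {len(raw)}")
    return [list(raw[row * FEATURE_MAP_WIDTH:(row + 1) * FEATURE_MAP_WIDTH])
            for row in range(FEATURE_MAP_HEIGHT)]
```

In practice a header-parsing step would precede this, locating the feature map data and its size and resolution within the data file as the text describes.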
The mapping of user-defined flags onto the feature map (step 58 of Figure 4) is described in greater detail in conjunction with Figures 5-8. Figure 5 is a pictorial diagram illustrating the display 16 of Figure 1. The display 16 of Figure 5 displays a plurality of tiles 74 (also called image tiles) wherein each of the plurality of tiles 74 illustrates an image 18 of the cells on the slide 30 which represents one of the identified portions of the original specimen 20, for example, the mentioned 128 highest ranked portions or objects. A reviewer may then view the plurality of tiles 74 on the display 16 and, in this particular embodiment, 16 of the 128 images may be viewed at one time. However, either more or fewer images 18 may be viewed at one time on the display 16. Whenever a reviewer locates a particular image that he believes requires additional review, he may "tag" one of the tiles 74 and select that tile as a tile of interest 76. Whenever the reviewer selects a tile of interest 76, that particular tile is then surrounded by a first colored border 78 on the display 16. The reviewer may also highlight the particular image presently under review on the display 16, making that image a selected tile 80. Whenever the reviewer has selected a particular tile for review, the selected tile 80 is shaded with a second colored border 82. Each of the tiles of interest 76 as well as the selected tile 80 will then have slide coordinates 50 displayed in the lower portion of the image 18. The slide coordinates 50 represent the locational indices of that particular image on the slide 30.
Figure 6 is a pictorial diagram illustrating in greater detail a feature map 22 as viewed on the display 16. The feature map 22 in Figure 6 is of a different specimen than that represented by the feature map illustrated and described in conjunction with Figure 2, and both feature maps display various pieces of information which are pertinent to the user. To reiterate, with respect to the feature map 22 of Figure 6, the barcode label 31 is simply a representation of the actual barcode that is placed on the slide 30 to show the proper orientation of the slide 30. The slide section 32 of the feature map 22 has an exterior area 34 which represents the portion lying outside the cover slip 33. The cover slip area 35 shows the region inside the cover slip 33 and the specimen area 36 illustrates the portion within the cover slip 33 containing the specimen 20 on the slide 30. The highest ranked image markers 40 are also labeled on the feature map 22 and represent the 128 highest ranked objects within the specimen area 36. In addition to displaying the above information, the feature map 22 also displays user tags 42, 44 and 45 which are the user-defined portions of the feature map 22. The user tags 42, 44 and 45 are areas of the feature map 22 that represent areas marked for further review 42, images currently being viewed 44 and the present location of the microscope 45 and correspond to the tile of interest markers 76, the selected tile marker 80, and the microscope location marker 84, respectively. The feature map 22 therefore provides a color-coded display that gives the reviewing user a visual locational context in which each image 18 is being viewed. The exterior area 34, the cover slip area 35, the specimen area 36, the highest ranked image markers 40, the areas marked for further review 42, the image currently being viewed 44, and the microscope location 45 are preferably selectively color coded so as to provide the reviewing user an easy indication of which markers correspond to the various regions on the feature map 22. In this particular embodiment, the exterior area 34 is assigned a dark gray color, the bubble area 38 is assigned a white color, the cover slip area 35 is assigned a light gray color, the specimen area 36 is assigned a blue color, the highest ranked image markers 40 are assigned a yellow color, the areas marked for further review 42 are assigned a red color, the image currently being viewed 44 is assigned a green color, and the microscope location 45 is assigned a pink color. Other colors or symbols (or outlining, blinking, etc.) may alternatively be used (for brevity the coding will be referred to below as "color coding"). In this manner, the feature map 22 provides the evaluating user a color coded or symbolic representation of the important pieces of information. The feature map 22, in conjunction with the displayed tiles of interest 74 on the display 16, provides an additional tool for the user to efficiently and effectively view the identified images using the microscope. The slide map also may be relied on to convey to the user other information concerning the slide, slide preparation and/or the specimen on the slide.
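The color coding described above can be sketched as a simple palette lookup from feature status to display color. The patent names the colors of this embodiment but not numeric values, so the RGB triples below are assumptions chosen to match the named colors.

```python
# Assumed RGB values for the colors named in the embodiment.
PALETTE = {
    "exterior": (64, 64, 64),        # dark gray  - outside the cover slip
    "bubble": (255, 255, 255),       # white      - air bubble area
    "cover_slip": (192, 192, 192),   # light gray - inside the cover slip
    "specimen": (0, 0, 255),         # blue       - biological matter
    "ranked_marker": (255, 255, 0),  # yellow     - highest ranked images
    "further_review": (255, 0, 0),   # red        - user marked for review
    "current_image": (0, 255, 0),    # green      - image being viewed
    "microscope": (255, 192, 203),   # pink       - microscope location
}

def colorize(flag_grid):
    """Map a 2-D grid of flag-name strings to RGB pixels for display."""
    return [[PALETTE[name] for name in row] for row in flag_grid]
```

A display routine would then scale this small pixel grid up to the on-screen size of the feature map.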
Figure 7 is a block level diagram which illustrates the options involved in step 58 of Figure 4, which involves mapping user-defined flags onto the feature map 22. Mapping the user-defined flags onto the feature map 22 is effectuated by selecting a tile for review 86 (one example is by pointing to this tile with a mouse cursor), selecting (or having selected, sometimes referred to as tagging or tagged) a tile of interest 88 (one example is by clicking a mouse button while pointing to the tile), or communicating the present stage location of the microscope 90 (one example is by using information from position encoders associated with the microscope).
Figure 8 is a flowchart diagram illustrating the steps taken to map the user-defined flags onto the feature map 22 with respect to the selection of a tile for review 86 or the selection of tiles of interest 88. In a first step, step 92, the selected tile 80 (Figure 5) or the tiles of interest 76 are identified. The processor 26 identifies the slide coordinates 50 associated with the identified image or images 18 (Figures 1 and 2) at step 94. The processor 26 then converts the slide coordinates 50 to feature map coordinates at step 96. In an exemplary embodiment the feature map 22 consists of 100x70 bytes, or 7,000 pixels. In the exemplary embodiment, because the scanning resolution of the images is different from the feature map resolution, the processor 26 must convert the slide coordinates 50 to feature map coordinates via an algorithm that identifies the relative position of the image 18 on the slide 30 to place a flag at a pixel on the feature map 22 that has the same relative position. For example, if the slide coordinates 50 of a particular image are (x1, y1) and the scanning resolution is sx by sy, then the relative position of the image 18 on the slide 30 is x1/sx by y1/sy. Since the resolution of the feature map 22 is fx by fy, the feature map coordinates may be calculated in one embodiment as follows: X coordinate = (x1/sx) x fx, and Y coordinate = (y1/sy) x fy.
In this manner, the processor 26 converts the slide coordinates 50 to feature map coordinates for various scanning resolutions and for various feature map resolutions. The processor 26 then modifies the feature map data by inserting the appropriate flag where called for by the user (user defined portions mentioned above) or by the scanner (non user-defined portions, mentioned above) at step 98. In this manner, both user-defined flags and non user-defined flags are mapped onto the feature map 22.
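The conversion of step 96 can be sketched as follows. The ratio formula comes from the text; the integer truncation and the clamping of far-edge coordinates are added assumptions so that every slide position lands on a valid element of the 100x70 grid.

```python
def slide_to_feature_map(x1, y1, sx, sy, fx=100, fy=70):
    """Convert slide coordinates to feature map coordinates.

    (x1, y1) - slide coordinates of the image
    (sx, sy) - scanning resolution (extent of the scanned area in
               slide-coordinate units)
    (fx, fy) - feature map resolution, 100x70 in the exemplary embodiment
    """
    # Relative position of the image on the slide, scaled to the grid:
    # X = (x1 / sx) * fx, Y = (y1 / sy) * fy.
    col = int((x1 / sx) * fx)
    row = int((y1 / sy) * fy)
    # Clamp so a coordinate on the far edge stays inside the map (assumed).
    return min(col, fx - 1), min(row, fy - 1)
```

For example, an image halfway across and halfway down the scanned area maps to the center of the feature map regardless of the scanning resolution.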
The scaling of the scanning microscope and that of the reviewing microscope, such as that designated 25 in Figure 1 , may be different. Therefore it may be necessary to convert or to coordinate the scaling thereof.
Figure 9 is a flowchart diagram illustrating the steps taken to map the user-defined flag onto the feature map 22 for the present stage location of the microscope 25. The first step, step 100, communicates the microscope coordinates to the processor 26. The processor 26, at step 102, then correlates the microscope coordinates to the slide coordinates in one of two ways. When the slide 30 is positionally placed on the stage for delivery under the microscope optics 25, then the microscope system 24 is initialized by moving the microscope optics 25 to a known slide coordinate such as one of the corners of the slide 30. The microscope coordinates are then correlated to the slide coordinates in that manner. If the microscope positional resolution is different than the slide coordinate resolution, then the processor 26 must also convert the microscope coordinates to slide coordinates via an algorithm such as one that is the same as or similar to that described above with respect to step 96 of Figure 8. After converting the microscope coordinates to slide coordinates at step 102, the processor 26 converts the slide coordinates to feature map coordinates at step 104. This conversion is accomplished in the same way as step 96 of Figure 8. Alternatively, the processor 26 may combine the conversions of steps 102 and 104 into a single algorithm, thereby converting the microscope coordinates directly into feature map coordinates. The processor 26 then modifies the feature map data by inserting the appropriate flag which represents the present stage location of the viewing microscope at step 90. In this manner, a user-defined flag is mapped onto the feature map 22.
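The combined alternative mentioned above, folding the conversions of steps 102 and 104 into a single algorithm, might look like the following sketch. The `origin` and `m_per_slide` parameters are assumptions standing in for the initialization at a known slide corner and the stage encoder resolution, neither of which the patent specifies numerically.

```python
def microscope_to_feature_map(mx, my, origin, m_per_slide, sx, sy,
                              fx=100, fy=70):
    """Convert microscope stage coordinates directly to feature map
    coordinates, combining steps 102 and 104 into one algorithm.

    origin      - stage reading at a known slide corner (from the
                  initialization described in the text)
    m_per_slide - stage units per slide-coordinate unit (assumed)
    """
    # Step 102: stage coordinates -> slide coordinates.
    x1 = (mx - origin[0]) / m_per_slide
    y1 = (my - origin[1]) / m_per_slide
    # Step 104: slide coordinates -> feature map coordinates, using the
    # same ratio conversion as step 96 of Figure 8, clamped to the grid.
    col = min(int((x1 / sx) * fx), fx - 1)
    row = min(int((y1 / sy) * fy), fy - 1)
    return col, row
```

The returned element would then be flagged (e.g. pink) as the present microscope location 45.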
Although the invention has been shown and described with respect to a certain preferred embodiment or embodiments, it is obvious that equivalent alterations and modifications will occur to others skilled in the art upon the reading and understanding of the specification and the annexed drawings. In particular regard to the various functions performed by the above-described integers (components, assemblies, devices, compositions, etc.), the terms (including a reference to any "means") used to describe such integers are intended to correspond, unless otherwise indicated, to any integer which performs the specified function of the described integer (i.e., that is functionally equivalent), even though not structurally equivalent to the disclosed structure which performs the function in the herein illustrated exemplary embodiment or embodiments of the invention. In addition, while a particular feature of the invention may have been described above with respect to only one of several illustrated embodiments, such feature may be combined with one or more other features of the other embodiments, as may be desired and advantageous for any given or particular application.

Claims

WHAT IS CLAIMED IS:
1. A method of displaying a user-defined mapping of a medical specimen, comprising the steps of: creating a feature map of the medical specimen; mapping user-defined locations onto the feature map, thereby creating a custom feature map; and displaying the custom feature map.
2. The method of claim 1, wherein the step of creating a feature map comprises the steps of: mapping the position of a cover slip with respect to a slide containing the medical specimen into a memory; mapping the position of air bubbles located between the cover slip and the slide into the memory; mapping the position of the specimen on the slide into the memory; evaluating the specimen; ranking portions of the scanned specimen to identify portions of the scanned specimen that have the greatest risk of abnormality; and mapping the positions of the portions of the scanned specimen that have the greatest risk of abnormality into the memory.
3. The method of claim 1, wherein the step of mapping user-defined locations onto the feature map comprises the steps of: selecting at least one image to be viewed from a plurality of images, wherein the plurality of images represent portions of the medical specimen that have the greatest risk of abnormality; identifying locational indices of the at least one image on a slide containing the medical specimen; and modifying the feature map by marking the location of the at least one image that corresponds to the locational indices of the at least one image on the slide.
4. The method of claim 3, wherein the step of modifying the feature map comprises the steps of: converting the locational indices of the at least one image on the slide to feature map coordinates; and modifying at least one feature map pixel corresponding to the feature map coordinates, thereby modifying the feature map.
5. A system for displaying a user-defined mapping of a medical specimen, comprising: a processor for receiving data representing a medical specimen on an examination fixture; a display coupled to the processor for displaying selected visual images representative of portions of the medical specimen and an image of the entire specimen on the fixture; and an input device coupled to the processor for identifying particular images on the display, wherein the input device is operable to send a signal indicative of identified images to the processor, and wherein the processor effectuates a locational indication of the identified image on the display.
6. The system of claim 5, further comprising a microscope having a viewing stage coupled to the processor for reviewing the medical specimen concurrent with the display of selected visual images, wherein the microscope is operable to communicate locational indices of the viewing stage with respect to a slide containing the medical specimen, and wherein the processor effectuates a locational indication of the microscope with respect to the slide on the display.
7. The system of claim 5, further comprising a scanner coupled to the processor, wherein the scanner is operable to scan the medical specimen and create an image file corresponding to the scanned medical specimen.
8. The system of claim 7, wherein the image file comprises a plurality of images corresponding to the scanned specimen, locational indices which correspond to the location of each of the plurality of images on the slide containing the medical specimen, and a flag corresponding to each of the plurality of images, wherein the flag indicates a characteristic of each of the plurality of images.
9. The system of claim 8, wherein the characteristic indicated by each of the plurality of flags comprises whether a particular image of the plurality of images is a region outside a coverslip, an air bubble, a region inside the coverslip but not containing medical specimen material, a region inside the coverslip containing medical specimen material, a region ranked by the scanner as a region having the greatest risk of abnormality, a region selected by a user for further review, or a region presently being viewed by the microscope.
10. A method of mapping user-defined locations from a plurality of display images onto a feature map, comprising the steps of: selecting at least one image of interest from the plurality of display images; identifying slide coordinates associated with the selected at least one image of interest, wherein the slide coordinates define the location on a slide where a piece of matter corresponding to the at least one image of interest resides; converting the slide coordinates to feature map coordinates, wherein the feature map coordinates define the location on a graphical image that represents a graphical reproduction of the slide; and modifying feature map data at the location dictated by the feature map coordinates to indicate a selected image of interest.
11. The method of claim 10, wherein the step of selecting at least one image of interest from a plurality of display images comprises the steps of: viewing the plurality of display images on a monitor; identifying at least one image of interest from the plurality of images on the monitor; and clicking on the identified at least one image of interest with a mouse, thereby communicating the selection to a computer.
12. The method of claim 10, wherein the step of identifying slide coordinates associated with the selected at least one image of interest comprises: accessing an image file that correlates each of the plurality of display images to their corresponding slide coordinates; and reading the slide coordinates that correspond to the at least one image of interest.
13. The method of claim 10, wherein the step of converting the slide coordinates to feature map coordinates comprises the steps of: determining the total number of slide coordinates on a slide, wherein the number of slide coordinates corresponds to the total number of images taken of the slide; determining the display resolution of the feature map; creating a ratio of the total number of slide coordinates and the display resolution of the feature map; and converting the slide coordinates to feature map coordinates using the ratio.
14. A method of mapping a location of a viewing apparatus with respect to a viewing specimen onto a feature map, comprising the steps of: identifying the positional control indices of the viewing apparatus; calculating the slide coordinates that correspond to the positional control indices, thereby identifying the slide coordinates associated with the location of the viewing apparatus; converting the slide coordinates to feature map coordinates, wherein the feature map coordinates define the location on a graphical image that represents a graphical reproduction of the slide; and modifying feature map data at the location dictated by the feature map coordinates to indicate the location of the viewing apparatus with respect to the slide.
15. A method of creating a user-defined mapping of a medical specimen on a slide, comprising the steps of: identifying portions of the medical specimen for further review and their positional indices with respect to the slide, the identified portions being portions of the specimen having the greatest risk of abnormality; assigning the identified portions of the medical specimen with a first flag; displaying the identified portions of the medical specimen, thereby creating a plurality of displayed images for review; identifying one or more of the displayed images; assigning the identified displayed images with a second flag; converting the positional indices of the identified portions of the medical specimen and the identified displayed images to feature map coordinates, wherein the feature map coordinates define the location on a graphical image that represents a graphical reproduction of the slide; modifying feature map data according to the first flag of the identified portions of the medical specimen; modifying feature map data according to the second flag of the identified displayed images, thereby creating a custom feature map; and displaying the custom feature map.
16. A method of displaying a user-defined mapping of a medical specimen, comprising the steps of: creating a feature map of the medical specimen; mapping user-defined locations onto the feature map, thereby creating a custom feature map; and displaying the custom feature map.
17. The method of claim 16, wherein the step of creating a feature map comprises the steps of: mapping the position of a cover slip with respect to a slide containing the medical specimen into a memory; mapping the position of air bubbles located between the cover slip and the slide into the memory; mapping the position of the specimen on the slide into the memory; evaluating the specimen; ranking portions of the scanned specimen to identify portions of the scanned specimen that have the greatest risk of abnormality; and mapping the positions of the portions of the scanned specimen that have the greatest risk of abnormality into the memory.
18. The method of claim 16, wherein the step of mapping user-defined locations onto the feature map comprises the steps of: selecting at least one image to be viewed from a plurality of images, wherein the plurality of images represent portions of the medical specimen that have the greatest risk of abnormality; identifying locational indices of the at least one image on a slide containing the medical specimen; and modifying the feature map by marking the location of the at least one image that corresponds to the locational indices of the at least one image on the slide.
19. The method of claim 18, wherein the step of modifying the feature map comprises the steps of: converting the locational indices of the at least one image on the slide to feature map coordinates; and modifying at least one feature map pixel corresponding to the feature map coordinates, thereby modifying the feature map.
20. The method of any of claims 16-19, wherein the specimen is a medical specimen.
21. A system for displaying a user-defined mapping of a medical specimen, comprising: a processor for receiving data representing a medical specimen on an examination fixture; a display coupled to the processor for displaying selected visual images representative of portions of the medical specimen and an image of the entire specimen on the fixture; and an input device coupled to the processor for identifying particular images on the display, wherein the input device is operable to send a signal indicative of identified images to the processor, and wherein the processor effectuates a locational indication of the identified image on the display.
22. The system of claim 21, further comprising a microscope having a viewing stage coupled to the processor for reviewing the medical specimen concurrent with the display of selected visual images, wherein the microscope is operable to communicate locational indices of the viewing stage with respect to a slide containing the medical specimen, and wherein the processor effectuates a locational indication of the microscope with respect to the slide on the display.
23. The system of claim 21 , further comprising a scanner coupled to the processor, wherein the scanner is operable to scan the medical specimen and create an image file corresponding to the scanned medical specimen.
24. The system of claim 23, wherein the image file comprises a plurality of images corresponding to the scanned specimen, locational indices which correspond to the location of each of the plurality of images on the slide containing the medical specimen, and a flag corresponding to each of the plurality of images, wherein the flag indicates a characteristic of each of the plurality of images.
25. The system of claim 24, wherein the characteristic indicated by each of the plurality of flags comprises whether a particular image of the plurality of images is a region outside a coverslip, an air bubble, a region inside the coverslip but not containing medical specimen material, a region inside the coverslip containing medical specimen material, a region ranked by the scanner as a region having the greatest risk of abnormality, a region selected by a user for further review, or a region presently being viewed by the microscope.
26. The system of any of claims 21-25, wherein the specimen is a medical specimen.
27. A method of creating a user-defined mapping of a medical specimen on a slide, comprising the steps of: identifying portions of the medical specimen for further review and their positional indices with respect to the slide, the identified portions being portions of the specimen having the greatest risk of abnormality; assigning the identified portions of the medical specimen with a first flag; displaying the identified portions of the medical specimen, thereby creating a plurality of displayed images for review; identifying one or more of the displayed images; assigning the identified displayed images with a second flag; converting the positional indices of the identified portions of the medical specimen and the identified displayed images to feature map coordinates, wherein the feature map coordinates define the location on a graphical image that represents a graphical reproduction of the slide; modifying feature map data according to the first flag of the identified portions of the medical specimen; modifying feature map data according to the second flag of the identified displayed images, thereby creating a custom feature map; and displaying the custom feature map.
28. The method of claim 27, wherein the specimen is a medical specimen.
29. A method of displaying a user-defined mapping of a specimen, comprising the steps of: creating a map representative of the specimen or of characteristics or features thereof; mapping user-defined locations onto the map, thereby creating a custom map; and displaying the custom map.
30. The method of claim 29, wherein the map is a map of features of the specimen.
31. The method of either of claims 29 or 30, wherein the specimen is a medical specimen.
32. A method of mapping user-defined locations from a plurality of display images onto a feature map, comprising the steps of: selecting at least one image of interest from the plurality of display images; identifying position coordinates associated with the selected at least one image of interest, wherein the coordinates define the location where a piece of matter corresponding to the at least one image of interest resides; converting the coordinates to feature map coordinates, wherein the feature map coordinates define the location on a graphical image that represents a graphical reproduction of the area including such location; and modifying feature map data at the location dictated by the feature map coordinates to indicate a selected image of interest.
33. The method of claim 32, wherein the area including such location is on a slide.
34. The method of claim 33, wherein the slide is of the type for placement in or with respect to a device for viewing the piece of matter.
35. A method of mapping a location of a viewing apparatus with respect to a viewing specimen onto a feature map, comprising the steps of: identifying the positional control indices of the viewing apparatus; calculating the coordinates that correspond to the positional control indices, thereby identifying the coordinates associated with the location of the viewing apparatus; converting the coordinates to feature map coordinates, wherein the feature map coordinates define the location on a graphical image that represents a graphical reproduction of the area at which the specimen is provided; and modifying feature map data at the location dictated by the feature map coordinates to indicate the location of the viewing apparatus with respect to the area at which the specimen is provided.
36. The method of claim 35, wherein the specimen is provided on a slide.
37. A method of creating a user-defined mapping of a specimen on a support, comprising the steps of: identifying portions of the specimen for further review and their positional indices with respect to the support, the identified portions being portions of the specimen having the greatest risk of abnormality; assigning the identified portions of the specimen with a first flag; displaying the identified portions of the specimen, thereby creating a plurality of displayed images for review; identifying one or more of the displayed images; assigning the identified displayed images with a second flag; converting the positional indices of the identified portions of the specimen and the identified displayed images to feature map coordinates, wherein the feature map coordinates define the location on a graphical image that represents a graphical reproduction of the support; modifying feature map data according to the first flag of the identified portions of the specimen; modifying feature map data according to the second flag of the identified displayed images, thereby creating a custom feature map; and displaying the custom feature map.
38. The method of claim 37, wherein the support is a slide.
PCT/US1998/007100 1997-04-11 1998-04-09 Specimen evaluation device WO1998047095A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU71060/98A AU7106098A (en) 1997-04-11 1998-04-09 Specimen evaluation device

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US4317097P 1997-04-11 1997-04-11
US60/043,170 1997-04-11
US85161797A 1997-05-06 1997-05-06
US08/851,617 1997-05-06

Publications (1)

Publication Number Publication Date
WO1998047095A1 true WO1998047095A1 (en) 1998-10-22

Family

ID=26720111

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1998/007100 WO1998047095A1 (en) 1997-04-11 1998-04-09 Specimen evaluation device

Country Status (2)

Country Link
AU (1) AU7106098A (en)
WO (1) WO1998047095A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7006674B1 (en) 1999-10-29 2006-02-28 Cytyc Corporation Apparatus and methods for verifying the location of areas of interest within a sample in an imaging system

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5416602A (en) * 1992-07-20 1995-05-16 Automated Medical Access Corp. Medical image system with progressive resolution
US5526258A (en) * 1990-10-10 1996-06-11 Cell Analysis System, Inc. Method and apparatus for automated analysis of biological specimens

Also Published As

Publication number Publication date
AU7106098A (en) 1998-11-11

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AL AM AT AU AZ BA BB BG BR BY CA CH CN CU CZ DE DK EE ES FI GB GE GH GM GW HU ID IL IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT UA UG US UZ VN YU ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW SD SZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN ML MR NE SN TD TG

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

NENP Non-entry into the national phase

Ref country code: JP

Ref document number: 1998544044

Format of ref document f/p: F

NENP Non-entry into the national phase

Ref country code: CA

122 Ep: pct application non-entry in european phase