US20030028113A1 - Ultrasound scan conversion with spatial dithering - Google Patents

Ultrasound scan conversion with spatial dithering

Info

Publication number
US20030028113A1
US20030028113A1 US10/114,183 US11418302A
Authority
US
United States
Prior art keywords
data
data processing
interface unit
interface
probe
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US10/114,183
Inventor
Jeffrey Gilbert
Alice Chiang
Steven Broadstone
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
TeraTech Corp
Original Assignee
TeraTech Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US08/496,804 external-priority patent/US5590658A/en
Priority claimed from US08/496,805 external-priority patent/US5839442A/en
Priority claimed from PCT/US1996/011166 external-priority patent/WO1997001768A2/en
Priority claimed from US08/773,647 external-priority patent/US5904652A/en
Application filed by TeraTech Corp filed Critical TeraTech Corp
Priority to US10/114,183 priority Critical patent/US20030028113A1/en
Publication of US20030028113A1 publication Critical patent/US20030028113A1/en
Abandoned legal-status Critical Current

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/52Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
    • G01S7/52017Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
    • G01S7/52023Details of receivers
    • G01S7/52044Scan converters
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/46Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B8/461Displaying means of special interest
    • A61B8/463Displaying means of special interest characterised by displaying multiple images or images and diagnostic data on one display
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/46Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B8/461Displaying means of special interest
    • A61B8/465Displaying means of special interest adapted to display user selection data, e.g. icons or menus
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/56Details of data transmission or power supply
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/88Sonar systems specially adapted for specific applications
    • G01S15/89Sonar systems specially adapted for specific applications for mapping or imaging
    • G01S15/8906Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques
    • G01S15/8909Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques using a static transducer configuration
    • G01S15/8915Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques using a static transducer configuration using a transducer array
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/88Sonar systems specially adapted for specific applications
    • G01S15/89Sonar systems specially adapted for specific applications for mapping or imaging
    • G01S15/8906Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques
    • G01S15/8934Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques using a dynamic transducer configuration
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/88Sonar systems specially adapted for specific applications
    • G01S15/89Sonar systems specially adapted for specific applications for mapping or imaging
    • G01S15/8906Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques
    • G01S15/8979Combined Doppler and pulse-echo imaging systems
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/003Transmission of data between radar, sonar or lidar systems and remote stations
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/52Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
    • G01S7/52017Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
    • G01S7/52023Details of receivers
    • G01S7/52025Details of receivers for pulse systems
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/52Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
    • G01S7/52017Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
    • G01S7/52053Display arrangements
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/52Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
    • G01S7/52017Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
    • G01S7/52053Display arrangements
    • G01S7/52057Cathode ray tube displays
    • G01S7/5206Two-dimensional coordinated display of distance and direction; B-scan display
    • G01S7/52063Sector scan display
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/52Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
    • G01S7/52017Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
    • G01S7/52053Display arrangements
    • G01S7/52057Cathode ray tube displays
    • G01S7/52073Production of cursor lines, markers or indicia by electronic means
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/52Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
    • G01S7/52017Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
    • G01S7/52053Display arrangements
    • G01S7/52057Cathode ray tube displays
    • G01S7/52074Composite displays, e.g. split-screen displays; Combination of multiple images or of images and alphanumeric tabular information
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/52Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
    • G01S7/52017Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
    • G01S7/52079Constructional features
    • G01S7/5208Constructional features with integration of processing functions inside probe or scanhead
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2560/00Constructional details of operational features of apparatus; Accessories for medical measuring apparatus
    • A61B2560/04Constructional details of apparatus
    • A61B2560/0443Modular apparatus
    • A61B2560/045Modular apparatus with a separable interface unit, e.g. for communication
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B7/00Instruments for auscultation
    • A61B7/02Stethoscopes
    • A61B7/04Electric stethoscopes
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00Diagnosis using ultrasonic, sonic or infrasonic waves
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/12Diagnosis using ultrasonic, sonic or infrasonic waves in body cavities or body tracts, e.g. by using catheters
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/44Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
    • A61B8/4444Constructional features of the ultrasonic, sonic or infrasonic diagnostic device related to the probe
    • A61B8/4455Features of the external shape of the probe, e.g. ergonomic aspects
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/44Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
    • A61B8/4483Constructional features of the ultrasonic, sonic or infrasonic diagnostic device characterised by features of the ultrasound transducer
    • A61B8/4488Constructional features of the ultrasonic, sonic or infrasonic diagnostic device characterised by features of the ultrasound transducer the transducer being a phased array
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/88Sonar systems specially adapted for specific applications
    • G01S15/89Sonar systems specially adapted for specific applications for mapping or imaging
    • G01S15/8906Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques
    • G01S15/899Combination of imaging systems with ancillary equipment
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/52Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
    • G01S7/52017Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
    • G01S7/52023Details of receivers
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/52Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
    • G01S7/52017Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
    • G01S7/52023Details of receivers
    • G01S7/52025Details of receivers for pulse systems
    • G01S7/52026Extracting wanted echo signals
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/52Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
    • G01S7/52017Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
    • G01S7/52023Details of receivers
    • G01S7/52034Data rate converters
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/52Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
    • G01S7/52017Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
    • G01S7/52046Techniques for image enhancement involving transmitter or receiver
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/52Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
    • G01S7/52017Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
    • G01S7/52079Constructional features

Definitions

  • Conventional ultrasound imaging systems typically include a hand-held scan head coupled by a cable to a large rack-mounted console processing and display unit.
  • the scan head typically includes an array of ultrasonic transducers which transmit ultrasonic energy into a region being imaged and receive reflected ultrasonic energy returning from the region.
  • the transducers convert the received ultrasonic energy into low-level electrical signals which are transferred over the cable to the processing unit.
  • the processing unit applies appropriate beam forming techniques such as dynamic focusing to combine the signals from the transducers to generate an image of the region of interest.
  • Typical conventional ultrasound systems include transducer arrays having a plurality, for example 128 , of ultrasonic transducers. Each transducer is associated with its own processing circuitry located in the console processing unit.
  • the processing circuitry typically includes driver circuits which, in the transmit mode, send precisely timed drive pulses to the transducer to initiate transmission of the ultrasonic signal. These transmit timing pulses are forwarded from the console processing unit along the cable to the scan head.
  • beam forming circuits of the processing circuitry introduce the appropriate delay into each low-level electrical signal from the transducers to dynamically focus the signals such that an accurate image can subsequently be generated.
  • the ultrasound signal is received and digitized in its natural polar (r, θ) form.
  • this representation is inconvenient, so it is converted into a rectangular (x,y) representation for further processing.
  • the rectangular representation is digitally corrected for the dynamic range and brightness of various displays and hard-copy devices.
  • the data can also be stored and retrieved for redisplay.
  • the (x,y) values must be computed from the (r, θ) values because the points on the (r, θ) array and the rectangular (x,y) grid are not coincident.
  • each point on the (x,y) grid is visited and its value is computed from the two nearest θ values by linear interpolation, or from the four nearest neighbors on the (r, θ) array by bi-linear interpolation.
  • This is accomplished by use of a finite state machine to generate the (x,y) traversal pattern, a bi-directional shift register to hold the (r, θ) data samples, and a large number of digital logic and memory units to control the process and ensure that the correct asynchronously received samples of (r, θ) data arrive for interpolation at the right time for each (x,y) point.
  • This prior implementation can be both inflexible and unnecessarily complex. Despite the extensive control hardware, only a single path through the (x,y) array is possible.
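  • For contrast with the dithering approach described later, a minimal sketch of this conventional bi-linear interpolation step is given below; the array layout, sampling parameters, and geometry are illustrative assumptions rather than details taken from the patent.

```python
import numpy as np

def bilinear_scan_convert(polar, r0, dr, theta0, dtheta, xs, ys):
    """Conventional scan conversion: bilinearly interpolate polar samples
    polar[i_r, i_theta] onto a rectangular (x, y) grid. r0/dr and
    theta0/dtheta describe the radial and angular sampling (illustrative
    parameters, not values from the patent)."""
    n_r, n_t = polar.shape
    out = np.zeros((len(ys), len(xs)))
    for iy, y in enumerate(ys):
        for ix, x in enumerate(xs):
            r = np.hypot(x, y)
            theta = np.arctan2(x, y)          # angle from the array normal
            fr = (r - r0) / dr                # fractional radial index
            ft = (theta - theta0) / dtheta    # fractional angular index
            i, j = int(np.floor(fr)), int(np.floor(ft))
            if 0 <= i < n_r - 1 and 0 <= j < n_t - 1:
                a, b = fr - i, ft - j
                out[iy, ix] = ((1 - a) * (1 - b) * polar[i, j]
                               + a * (1 - b) * polar[i + 1, j]
                               + (1 - a) * b * polar[i, j + 1]
                               + a * b * polar[i + 1, j + 1])
    return out
```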
  • scan data is directed into a computer after beamforming and scan conversion is performed to convert the scan data into a display format.
  • scan conversion can be performed entirely using a software module on a personal computer.
  • a board with additional hardware can be inserted to provide selected scan conversion functions or to perform the entire scan conversion process.
  • the software system is preferred as additional hardware is minimized so the personal computer can be a small portable platform, such as a laptop or palmtop computer.
  • Scan conversion is preferably performed using a spatial dithering process described in greater detail below. Spatial dithering simplifies the computational requirements for scan conversion while retaining image resolution and quality. Thus, scan conversion can be performed on a personal computer without the need for more complex interpolation techniques and still provide conversion at frame rates suitable for real time ultrasound imaging.
  • the scan conversion procedure includes an input array, a remap array, and an output array.
  • the remap array is an array of indices or pointers, which is the size of the output image used to determine where to get each pixel from the input array.
  • the numbers in each position in the remap array indicate where in the input data to take each pixel which will go into the output array in the same position.
  • the remap array and output array can be thought of as having the same geometry while the input array and output array have the same type of data, i.e., actual image data.
  • the input array has new data for each ultrasound frame, which means that it processes the data and puts the data in the output array on every frame.
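  • A minimal sketch of this remap-based per-frame operation is shown below; the array names and the use of flat indices into the input data are illustrative assumptions.

```python
import numpy as np

def scan_convert_frame(input_array, remap_array):
    """Per-frame scan conversion: remap_array has the geometry of the
    output image and holds, at each position, a flat index into the input
    array, so producing an output frame is one lookup per pixel. The remap
    array itself is recomputed only when the head type or viewing
    parameters change."""
    return input_array.ravel()[remap_array]
```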
  • the DSP can perform the computations of the remap array.
  • certain scan conversion functions can be performed by hardware inserted into the personal computer on a circuit board.
  • This board or a card can be inserted and used as an interface to deliver data in the proper form to the PC bus controller.
  • FIG. 1 is a block diagram of a conventional imaging array as used in an ultrasound imaging system.
  • FIG. 2A is a schematic illustration of the relationship between a linear ultrasound transducer array and a rectangular scan region in accordance with the present invention.
  • FIG. 2B is a schematic illustration of the relationship between a curved linear ultrasound transducer array and a curved scan region in accordance with the present invention.
  • FIG. 2C is a schematic illustration of the relationship between a linear ultrasound transducer array and a trapezoidal scan region in accordance with the present invention.
  • FIG. 2D is a schematic illustration of a phased array scan region.
  • FIG. 3 is a schematic pictorial view of a preferred embodiment of the ultrasound imaging system of the present invention.
  • FIG. 4A is a schematic functional block diagram of a preferred embodiment of the ultrasound imaging system of the invention.
  • FIG. 4B is a schematic functional block diagram of an alternative preferred embodiment of the ultrasound imaging system of the invention.
  • FIG. 5A is a schematic diagram of a beamforming and filtering circuit in accordance with the invention.
  • FIG. 5B is a schematic diagram of another preferred embodiment of a beamforming and filtering circuit in accordance with the invention.
  • FIG. 5C is a schematic diagram of another preferred embodiment of a beamforming and filtering circuit in accordance with the invention.
  • FIG. 5D is a schematic diagram of a low pass filter in accordance with the invention.
  • FIG. 5E is an example of an interface circuit board in accordance with the invention.
  • FIG. 5F is a preferred embodiment of an integrated beamforming circuit in accordance with the invention.
  • FIG. 6 is a graphical illustration of the passband of a filter in accordance with the invention.
  • FIG. 7A is a schematic diagram of input points overlayed on a display.
  • FIG. 7B is a schematic diagram of the display of FIG. 7A having input data converted to pixels.
  • FIG. 8 is a schematic diagram of a preferred embodiment of a general purpose image remapping architecture.
  • FIGS. 9 A- 9 B are a flow chart illustrating a remap array computation technique in accordance with the invention.
  • FIG. 10 is a flow chart of an output frame computation engine.
  • FIGS. 11 A- 11 B are schematic pictorial views of two user-selectable display presentation formats used in the ultrasound imaging system of the invention.
  • FIGS. 12 A- 12 B are functional block diagrams of a preferred graphical user interface.
  • FIG. 13 illustrates a dialog box for ultrasound image control.
  • FIGS. 14 A- 14 D illustrate display boxes for entering system information.
  • FIGS. 15 A- 15 C illustrate additional dialog boxes for entering probe or FOV data.
  • FIGS. 15 D- 15 J illustrate additional display and dialog boxes for a preferred embodiment of the invention.
  • FIG. 16 illustrates imaging and display operations of a preferred embodiment of the invention.
  • FIGS. 17 A- 17 C illustrate preferred embodiments of integrated probe systems in accordance with the invention.
  • FIG. 18 illustrates a 64 channel integrated controller of a transmit/receive circuit for an ultrasound system.
  • FIG. 19 illustrates another preferred embodiment of a transmit and receive circuit.
  • FIG. 20 illustrates a Doppler Sonogram system in accordance with the invention.
  • FIG. 21 illustrates a color flow map based on a fast Fourier transform pulsed Doppler processing system in accordance with the invention.
  • FIG. 22 illustrates a processing system for waveform generation in accordance with the invention.
  • FIG. 23 is a system for generating a color flow map in accordance with the invention.
  • FIG. 24 is a process flow sequence for computing a color flow map in accordance with the invention.
  • FIG. 25 is a process flow sequence for generating a color flow map using a cross correlation method.
  • A schematic block diagram of an imaging array 18 of N piezoelectric ultrasonic transducers 18(1)-18(N), as used in an ultrasound imaging system, is shown in FIG. 1.
  • the array of piezoelectric transducer elements 18(1)-18(N) generates acoustic pulses which propagate into the image target (typically a region of human tissue) or transmitting media with a narrow beam 180.
  • the pulses propagate as a spherical wave 185 with a roughly constant velocity.
  • Acoustic echoes in the form of returning signals from image points I p or reflectors are detected by the same array 18 of transducer elements, or another receiving array and can be displayed in a fashion to indicate the location of the reflecting structure.
  • the acoustic echo from the image point I p in the transmitting media reaches each transducer element 18 ( 1 )- 18 (N) of the receiving array after various propagation times.
  • the propagation time for each transducer element is different and depends on the distance between each transducer element and the image point I p . This holds true for typical ultrasound transmitting media, i.e. soft bodily tissue, where the velocity of sound is at least relatively constant.
  • the received information is displayed in a manner to indicate the location of the reflecting structure.
  • the pulses can be transmitted along a number of lines-of-sight as shown in FIG. 1. If the echoes are sampled and their amplitudes are coded as brightness, a grey scale image can be displayed on a cathode ray tube (CRT) or monitor. An image typically contains 128 such scanned lines at 0.75° angular spacing, forming a 90° sector image. Because the velocity of sound in water is 1.54×10^5 cm/sec, the round-trip time to a depth of 16 cm will be 208 μs. Thus, the total time required to acquire data along 128 lines of sight (for one image) is 26.6 ms.
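  • As a worked check of the timing quoted above (no new data, just the arithmetic implied by the stated sound speed and depth):

$$
t_{\text{line}} = \frac{2 \times 16\ \text{cm}}{1.54 \times 10^{5}\ \text{cm/s}} \approx 208\ \mu\text{s},
\qquad
t_{\text{frame}} = 128 \times 208\ \mu\text{s} \approx 26.6\ \text{ms}.
$$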
  • two-dimensional images can be produced at rates corresponding to standard television video.
  • if the ultrasound imager is used to view reflected or back scattered sound waves through the chest wall between a pair of ribs, the heart pumping can be imaged in real time.
  • the ultrasonic transmitter is typically a linear array of piezoelectric transducers 18 ( 1 )- 18 (N) (typically spaced half-wavelength apart) for steered arrays whose elevation pattern is fixed and whose azimuth pattern is controlled primarily by delay steering.
  • the radiating (azimuth) beam pattern of a conventional array is controlled primarily by applying delayed transmitting pulses to each transducer element 18 ( 1 )- 18 (N) in such a manner that the energy from all the transmitters summed together at the image point I p produces a desired beam shape. Therefore, a time delay circuit is needed in association with each transducer element 18 ( 1 )- 18 (N) for producing the desired transmitted radiation pattern along the predetermined direction.
  • the same array 18 of transducer elements 18 ( 1 )- 18 (N) can be used for receiving the return signals.
  • the reflected or echoed beam energy waveform originating at the image point reaches each transducer element after a time delay equal to the distance from the image point to the transducer element divided by the assumed constant speed of the propagation of waves in the media. Similar to the transmitting mode, this time delay is different for each transducer element.
  • these differences in path length should be compensated for by focusing the reflected energy at each receiver from the particular image point for any given depth.
  • the delay at each receiving element is a function of the distance measured from the element to the center of the array and the viewing angular direction measured normal to the array.
  • the beam forming and focusing operations involve forming a sum of the scattered waveforms as observed by all the transducers, but in this sum, the waveforms must be differentially delayed so they will all arrive in phase and properly weighted in the summation.
  • a beam forming circuit is required which can apply a different delay on each channel, and vary that delay with time.
  • the receiving array varies its focus continually with depth. This process is known as dynamic focusing.
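  • A sketch of how such per-channel focusing delays can be computed for a focal point at range r and steering angle θ is given below; the geometry, element pitch, and sound speed are generic assumptions rather than values taken from the patent.

```python
import numpy as np

def focusing_delays(element_x, r, theta, c=1540.0):
    """Relative receive delays (seconds) that align echoes from a focal
    point at range r (m) and steering angle theta (rad, measured from the
    array normal) for elements at lateral positions element_x (m, relative
    to the array center). Dynamic focusing re-evaluates this as r grows
    with time."""
    # distance from each element to the focal point
    d = np.sqrt(r**2 + element_x**2 - 2.0 * r * element_x * np.sin(theta))
    dt = (d - r) / c              # extra travel time relative to the array center
    return dt.max() - dt          # delay each channel so all arrivals coincide

# example: 64 elements at half-wavelength pitch for a 3.5 MHz transducer
pitch = 1540.0 / 3.5e6 / 2.0
x = (np.arange(64) - 31.5) * pitch
delays = focusing_delays(x, r=0.05, theta=np.deg2rad(20.0))
```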
  • each received pulse is digitized in a conventional manner.
  • the digital representation of each received pulse is a time sequence corresponding to a back-scattering cross section of ultrasonic energy returning from a field point as a function of range at the azimuth formed by the beam. Successive pulses are pointed in different directions, covering a field of view from −45° to +45°.
  • time averaging of data from successive observations of the same point (referred to as persistence weighting) is used to improve image quality.
  • FIGS. 2 A- 2 D are schematic diagrams illustrating the relationship between the various transducer array configurations used in the present invention and their corresponding scan image regions.
  • FIG. 2A shows a linear array 18 A which produces a rectangular scanning image region 180 A.
  • Such an array typically includes 128 transducers.
  • FIG. 2B is a schematic diagram showing the relationship between a curved linear transducer array 18 B and the resulting sectional curved image scan region 180 B.
  • the array 18 B typically includes 128 adjacent transducers.
  • FIG. 2C shows the relationship between a linear transducer array 18 C and a trapezoidal image region 180 C.
  • the array 18 C is typically formed from 192 adjacent transducers, instead of 128 .
  • the linear array is used to produce the trapezoidal scan region 180 C by combining linear scanning as shown in FIG. 2A with phased array scanning.
  • the 64 transducers on opposite ends of the array 18 C are used in a phased array configuration to achieve the curved angular portions of the region 180 C at its ends.
  • the middle 64 transducers are used in the linear scanning mode to complete the rectangular portion of the region 180 C.
  • the trapezoidal region 180 C is achieved using a sub-aperture scanning approach in which only 64 transducers are active at any one time.
  • adjacent groups of 64 transducers are activated alternately. That is, first, transducers 1 - 64 become active. Next, transducers 64 - 128 become active. In the next step, transducers 2 - 65 are activated, and then transducers 65 - 129 are activated. This pattern continues until transducers 128 - 192 are activated. Next, the scanning process begins over again at transducers 1 - 64 .
  • FIG. 2D shows a short linear array of transducers 18 D used to perform phased array imaging in accordance with the invention.
  • the linear array 18 D is used via phased array beam steering processing to produce an angular slice region 180 D.
  • FIG. 3 is a schematic pictorial view of an ultrasound imaging system 10 of the present invention.
  • the system includes a hand-held scan head 12 coupled to a portable data processing and display unit 14 which can be a laptop computer.
  • the data processing and display unit 14 can include a personal computer or other computer interfaced to a CRT for providing display of ultrasound images.
  • the data processor display unit 14 can also be a small, lightweight, single-piece unit small enough to be hand-held or worn or carried by the user.
  • FIG. 3 shows an external scan head
  • the scan head of the invention can also be an internal scan head adapted to be inserted through a lumen into the body for internal imaging.
  • the head can be a transesophageal probe used for cardiac imaging.
  • the scan head 12 is connected to the data processor 14 by a cable 16 .
  • the system 10 includes an interface unit 13 (shown in phantom) coupled between the scan head 12 and the data processing and display unit 14 .
  • the interface unit 13 preferably contains controller and processing circuitry including a digital signal processor (DSP).
  • the interface unit 13 can perform required signal processing tasks and can provide signal outputs to the data processing unit 14 and/or scan head 12 .
  • the interface unit 13 is preferably an internal card or chip set. When used with a desktop or laptop computer, the interface unit 13 can instead be an external device.
  • the hand-held housing 12 includes a transducer section 15 A and a handle section 15 B.
  • the transducer section 15 A is maintained at a temperature below 41° C. so that the portion of the housing that is in contact with the skin of the patient does not exceed this temperature.
  • the handle section 15 B does not exceed a second higher temperature preferably 50° C.
  • FIG. 4A is a schematic functional block diagram of one embodiment of the ultrasound imaging system 10 of the invention.
  • the scan head 12 includes an ultrasonic transducer array 18 which transmits ultrasonic signals into a region of interest or image target 11 , such as a region of human tissue, and receives reflected ultrasonic signals returning from the image target.
  • the scan head 12 also includes transducer driver circuitry 20 and pulse synchronization circuitry 22 .
  • the pulse synchronizer 22 forwards a series of precisely timed and delayed pulses to high voltage driver circuits in the drivers 20 .
  • the high-voltage driver circuits are activated to forward a high-voltage drive signal to each transducer in the transducer array 18 to activate the transducer to transmit an ultrasonic signal into the image target 11 .
  • Ultrasonic echoes reflected by the image target 11 are detected by the ultrasonic transducers in the array 18 .
  • Each transducer converts the received ultrasonic signal into a representative electrical signal which is forwarded to preamplification circuits 24 and time-varying gain control (TGC) circuitry 25 .
  • the preamp circuitry 24 sets the level of the electrical signals from the transducer array 18 at a level suitable for subsequent processing, and the TGC circuitry 25 is used to compensate for attenuation of the sound pulse as it penetrates through human tissue and also drives the beam forming circuits 26 (described below) to produce a line image.
  • the conditioned electrical signals are forwarded to the beam forming circuitry 26 which introduces appropriate differential delay into each of the received signals to dynamically focus the signals such that an accurate image can be created. Further details of the beam forming circuitry 26 and the delay circuits used to introduce differential delay into received signals and the pulses generated by the pulse synchronizer 22 are described in the incorporated International Application PCT/US96/11166.
  • the dynamically focused and summed signal is forwarded to an A/D converter 27 which digitizes the summed signal. Digital signal data is then forwarded from the A/D 27 over the cable 16 to a color doppler processing circuit 36 . It should be noted that the A/D converter 27 is not used in an alternative embodiment in which the analog summed signal is sent directly over the system cable 16 .
  • the digital signal is also demodulated in a demodulation circuit 28 and forwarded to a scan conversion circuit 37 in the data processor and display unit 14 .
  • a scan head memory 29 stores data from a controller 21 and the data processing and display unit 14 .
  • the scan head memory 29 provides stored data to the pulse synchronizer 22, the TGC 25 and the beam former 26.
  • the scan conversion circuitry 37 converts the digitized signal data from the beam forming circuitry 26 from polar coordinates (r, θ) to rectangular coordinates (x,y). After the conversion, the rectangular coordinate data can be forwarded to an optional post signal processing stage 30 where it is formatted for display on the display 32 or for compression in a video compression circuit 34.
  • the post processing 30 can also be performed using the scan conversion software described hereinafter.
  • Digital signal data from the A/D converter 27 is received by a pulsed or continuous Doppler processor 36 in the data processor unit 14.
  • the pulsed or continuous Doppler processor 36 generates data used to image moving target tissue 11 such as flowing blood.
  • a color flow map is generated.
  • the pulsed Doppler processor 36 forwards its processed data to the scan conversion circuitry 28 where the polar coordinates of the data are translated to rectangular coordinates suitable for display or video compression.
  • a control circuit preferably in the form of a microprocessor 38 inside of a personal computer (e.g., desktop, laptop, palmtop), controls the high-level operation of the ultrasound imaging system 10 .
  • the microprocessor 38 or a DSP initializes delay and scan conversion memory.
  • the control circuit 38 controls the differential delays introduced in both the pulse synchronizer 22 and the beam forming circuitry 26 via the scan head memory 27.
  • the microprocessor 38 also controls a memory 40 which stores data used by the scan conversion circuitry 28 . It will be understood that the memory 40 can be a single memory or can be multiple memory circuits.
  • the microprocessor 38 also interfaces with the post signal processing circuitry 30 and the video compression circuitry 34 to control their individual functions.
  • the video compression circuitry 34 compresses data to permit transmission of the image data to remote stations for display and analysis via a transmission channel.
  • the transmission channel can be a modem or wireless cellular communication channel or other known communication method.
  • the portable ultrasound imaging system 10 of the invention can preferably be powered by a battery 44 .
  • the raw battery voltage out of the battery 44 drives a regulated power supply 46 which provides regulated power to all of the subsystems in the imaging system 10 including those subsystems located in the scan head 12 .
  • power to the scan head can be provided from the data processing and display unit 14 over the cable 16 .
  • FIG. 4B is a schematic functional block diagram of an alternative preferred embodiment of the ultrasound imaging system of the invention.
  • demodulation circuitry is replaced by software executed by the microprocessor 38 in a modified data processing and display unit 14 ′.
  • the digital data stream from the A/D converter 27 is buffered by a FIFO memory 37 .
  • the microprocessor executes software instructions to perform demodulation, scan conversion, color Doppler processing, post signal processing and video compression.
  • many hardware functions of FIG. 4A are replaced by software stored in memory 40 in FIG. 4B, reducing hardware size and weight requirements for the system 10′.
  • Additional preferred embodiments of beam forming circuitry for ultrasound systems are depicted in FIGS. 5A, 5B, and 5C.
  • Each of these implementations requires that sampled-analog data be down-converted, or mixed, to a baseband frequency from an intermediate frequency (IF).
  • the down-conversion or mixing is accomplished by first multiplying the sampled data by a complex value (represented by the complex-valued exponential input to the multiplier stage), and then filtering the data to reject images that have been mixed to nearby frequencies.
  • the outputs of this processing are available at a minimum output sample rate and are available for subsequent display or Doppler processing.
  • a set of sampling circuits 56 is used to capture data 54, represented by packets of charge, in a CCD-based processing circuit fabricated on an integrated circuit 50.
  • Data are placed in one or more delay lines and output at appropriate times, using memory and control circuitry 62 and programmable delay circuits 58, to an optional interpolation filter 60.
  • the interpolation filter can be used to provide refined estimates of the round-trip time of a sound wave and thereby provide better focus of the returned signals from an array of sensors.
  • two processing channels 52 of an array of processors are depicted. The outputs from the interpolation filters are combined at an analog summing junction 66 to provide a datum of beamformed output from the array.
  • Data obtained using an ultrasound transducer resembles the output of a modest-bandwidth signal modulated by the center frequency of the transducer.
  • the center frequency, or characteristic frequency, of the transducer is equivalent to the IF.
  • 2πfI/fs, where fI is the intermediate frequency and fs is the sampling frequency.
  • the outputs of the multiplier 68 are termed in-phase (I) or quadrature (Q) samples. In general, both I and Q values will be non-zero.
  • when the sampling frequency is four times the intermediate frequency, however, the multiplier output will only produce either I or Q values in a repeating sequence, I, Q, −I, −Q, I, Q, −I . . .
  • the input data are only scaled by 1 and −1.
  • the output data are a[0], j*a[1], −a[2], −j*a[3], a[4], . . . , a[n]
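  • The short sketch below illustrates this special case (sampling at four times the IF); the signal parameters are illustrative assumptions.

```python
import numpy as np

# Sketch of the special case described above: sampling at four times the
# IF makes the complex mixing sequence 1, j, -1, -j, so each real input
# sample is only scaled by +/-1 and lands alternately on the I or the Q
# output. The signal parameters below are illustrative assumptions.
f_if = 2.5e6                      # transducer center (intermediate) frequency
f_s = 4.0 * f_if                  # sampling frequency = 4 x IF
n = np.arange(16)
a = np.cos(2.0 * np.pi * (f_if / f_s) * n + 0.3)   # sampled IF data a[n]

mixer = np.array([1, 1j, -1, -1j])[n % 4]          # repeating 1, j, -1, -j
baseband = a * mixer              # a[0], j*a[1], -a[2], -j*a[3], a[4], ...

# each sample is purely real (I) or purely imaginary (Q): I, Q, -I, -Q, ...
print(np.round(baseband[:8], 3))
```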
  • the I and Q outputs 74 , 76 are each low-pass filtered 70 , 72 to reject signal images that are mixed into the baseband.
  • the coefficients of the low-pass filters can be designed using a least-mean square (LMS or L2-norm) or Chebyshev (L-infinity norm) criteria. In practice, it is desirable to reduce the number of coefficients necessary to obtain a desired filter characteristic as much as possible.
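  • A sketch of designing such a low-pass filter under either criterion with standard tools is shown below; the sampling rate, band edges, and tap count are illustrative assumptions, not the patent's CCD filter design.

```python
import numpy as np
from scipy import signal

# Illustrative low-pass designs under either criterion named above.
fs = 10.0e6                        # sampling rate (Hz), assumed
numtaps = 13                       # short filter keeps hardware/CPU cost low
bands = [0, 1.0e6, 2.0e6, fs / 2]  # passband 0-1 MHz, stopband 2-5 MHz

# Chebyshev (L-infinity norm) equiripple design via Parks-McClellan
h_cheb = signal.remez(numtaps, bands, [1, 0], fs=fs)

# least-mean-square (L2-norm) design
h_ls = signal.firls(numtaps, bands, [1, 1, 0, 0], fs=fs)

w, H = signal.freqz(h_cheb, worN=2048, fs=fs)
print("stopband peak:", 20 * np.log10(np.abs(H[w >= 2.0e6]).max()), "dB")
```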
  • FIG. 5D An example of a CCD implementation of a low-pass filter is illustrated in FIG. 5D.
  • the device 90 consists of a 13-stage tapped delay line with five fixed-weight multipliers 94 to implement the filter coefficients.
  • the ripple in the passband is under 0.5 dB and the stopband attenuation is less than −30 dB of full scale.
  • the output of the low-pass filters are then decimated 78 by at least a factor of 2. Decimation greater than 2 may be warranted if the bandwidth of the ultrasound signal is bandlimited to significantly less than half the sampling frequency. For most ultrasound signals, a decimation factor greater than 2 is often used because the signal bandwidth is relatively narrow relative to the sampling frequency.
  • the order of the decimation and the low-pass filters may be interchanged to reduce the clocking frequency of the low-pass filters.
  • the coefficients for the I and Q low-pass filters can be chosen such that each filter only accepts every other datum at its input.
  • This “alternating clock” scheme permits the layout constraints to be relaxed when a decimation rate of 2 is chosen. These constraints can be further relaxed if the decimation factor is greater than 2 (i.e., when the signal bandwidth is much less than fs/2).
  • the down-converted output data are passed on for further processing that may include signal-envelope detection or Doppler processing.
  • the signal envelope, also referred to as the signal magnitude, can be computed from the I and Q data.
  • the I and Q data are often the inputs to Doppler processing which also uses the signal envelope to extract information in the positive- and/or negative-frequency sidebands of the signal.
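  • For reference, a one-line sketch of envelope (magnitude) detection from the baseband I and Q data:

```python
import numpy as np

def envelope(i_data, q_data):
    """Signal envelope (magnitude) computed from baseband I/Q samples."""
    return np.hypot(i_data, q_data)   # sqrt(I**2 + Q**2)
```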
  • only one down-conversion stage is required following the ultrasound beamforming.
  • FIG. 5B a down-conversion stage has been placed in each processing channel 52 following the sampling circuits 56 .
  • the production of I and Q data 86, 88 is performed exactly as before, but much sooner in the system.
  • the primary advantage of this approach is that the data rate in each processing channel can be reduced to a minimum, based on the ultrasound signal bandwidth and hence the selection of the low-pass filter and decimation factor.
  • all processing channels 52 will use the same complex-value multipliers and identical coefficients and decimation factors in the filter stage.
  • complex-valued data are delayed and interpolated to provide beamformed output.
  • FIG. 5C The ultrasound front end depicted in FIG. 5C is nearly identical to that in FIG. 5B.
  • the interpolation stage 85 , 87 has been removed and replaced by choosing unique values in the complex-valued multipliers to provide a more-precise estimate of the processing-channel delay.
  • This approach has the disadvantage that the output of the multiplier will always exhibit I and Q values that are non zero. This is a consequence of the varying sampling rate around the unit circle, in a complex-plane diagram, of the multiplier input.
  • this approach can provide a more precise estimate of the sample delay in each channel, but at the expense of producing fully complex-valued data at the output of each processing channel.
  • This modification may require more post-processing for envelope and Doppler detection than that presented in the previous implementations.
  • a preferred embodiment of a system used to interface between the output of the beamforming or filtering circuit and the computer is to provide a plug-in board or card (PCMCIA) for the computer.
  • the board 700 of FIG. 5E illustrates an embodiment in which 16 bits of digital beamformed data are received over the cable from the scanhead by differential receivers 702 .
  • a clock signal is also received at registers 704 along with converted differential data.
  • the first gate array 708 converts the 16 bits to 32 bits at half the data rate.
  • the 32 bit data is clocked into the FIFO 712 which outputs add-on data 716 .
  • the second gate array 710 has access to all control signals and outputs 714 to the PCI bus controller. This particular example utilizes 16 bits of data, however, this design can also be adapted for 32 bits or more.
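  • An illustrative software model of the width conversion performed by the first gate array is shown below; the pairing and word order are assumptions, not details taken from the patent.

```python
import numpy as np

def pack_16_to_32(words16):
    """Illustrative model of the gate array's width conversion: consecutive
    pairs of 16-bit words are combined into one 32-bit word, halving the
    word rate on the bus side. Assumes an even number of input words and
    that the later word occupies the high half."""
    w = np.asarray(words16, dtype=np.uint32)
    return (w[1::2] << 16) | w[0::2]
```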
  • a card suitable for insertion in a slot or port of a personal computer, laptop or palmtop computer can also be used.
  • the differential receivers input to registers, which deliver data to the FIFO and then to a bus controller that is located on the card. The output from the controller is connected directly to the PCI bus of the computer.
  • An alternative to the use of differential drivers and receivers to interconnect the scan head to the interface board or card is to utilize the IEEE 1394 standard cable also known as “firewire.”
  • FIG. 5F An example of a preferred embodiment of an integrated beamforming circuit 740 is illustrated in FIG. 5F.
  • the circuit 740 includes a timing circuit 742 , and 5 delay circuits 760 attached to each side of summing circuit 754 .
  • Each circuit 760 includes a sampling circuit 746 , a CCD delay line 752 , a control and memory circuit 750 , a decoder 748 , and a clocking driver circuit 744 .
  • the circuitry is surrounded by contact pads 756 to provide access to the chip circuitry.
  • the integrated circuit is preferably less than 20 square millimeters in area and can be mounted on a single board in the scan head as described in the various embodiments set forth in the above referenced incorporated application.
  • a sixteen, thirty two, or sixty four delay line integrated circuit can also be implemented utilizing a similar structure.
  • FIG. 7A is a schematic diagram of input points overlayed on a display. As illustrated, input points I p received from the ultrasound beam 180 do not exactly align with the rectangularly arranged pixel points P of a conventional display 32. Because the display 32 can only display pixelized data, the input points I p must be converted to the rectangular format.
  • FIG. 7B is a schematic diagram of the display of FIG. 7A having input data converted to pixels. As illustrated, each image point I p is assigned to a respective pixel point P on the display 32 to form an image.
  • One purpose of scan conversion is to perform the coordinate space transformation required for use with scan heads that are not flat linear, such as phased array, trapezoidal or curved linear heads. To do this, data must be read in one order and output data must be written in another order. Many existing systems must generate the transformation sequences on the fly, which reduces the flexibility and makes trapezoidal scan patterns more difficult.
  • because scan conversion is reordering the data, it can also be used to rotate, pan and zoom the data. Rotation is useful for viewing the image with the scan head depicted at the top, left, right, or bottom of the image, or at an arbitrary angle. Zooming and panning are commonly used to allow various parts of the image to be examined more closely.
  • irregular scan patterns can ease system design and allow greater scan head utilization. In particular, this allows reduction or hiding of dead time associated with imaging deep zones.
  • the beam is transmitted but the echo is not received until some later time, after the wave has had time to travel to the maximum depth and return. More efficient use of the system, and thus a higher frame rate or greater lateral sampling, can be obtained if other zones are illuminated and reconstructed during this dead time. This can cause the scan pattern to become irregular (although fixed and explicitly computed).
  • the flexible scan conversion described below corrects for this automatically.
  • FIG. 8 is a schematic diagram of a preferred embodiment of a general purpose image remapping architecture.
  • data is preferably brought directly into the PC after beamforming and the remainder of the manipulation is performed in software.
  • additional hardware is minimized so the personal computer can be a small portable platform, such as a laptop or palmtop computer.
  • the remap array 144 is an array of indices or pointers, which is the size of the output image used to determine where to get each pixel from the input array 142 .
  • the numbers in each position in the remap array 144 indicate where in the input data to take each pixel which will go into the output array 146 in the same position.
  • the remap array 144 and output array 146 can be thought of as having the same geometry while the input array 142 and output array 146 have the same type of data, i.e., actual image data.
  • the input array 142 has new data for each ultrasound frame, which means that it processes the data and puts the data in the output array 146 on every frame.
  • the remap array 144 is only updated when the head type or viewing parameters (i.e., zoom and pan) are updated. Consequently, the remap array 144 data can be generated relatively slowly (but still well under about one second or else it can become cumbersome) as long as the routine operation of computing a new output image from a new input data set is performed at the frame rate of approximately 30 frames per second.
  • DSP digital signal processor
  • input memory for the input array 142 can be either two banks of Static Random Access Memory (SRAM) or one bank of Video Random Access Memory (VRAM), where the input is serial access and the output is random access.
  • the VRAM bank may be too slow and refresh too costly.
  • the remap memory for the remap array 144 is preferably sequential access memory embodied in VRAM, or Dynamic Random Access Memory (DRAM), although random access SRAM will also work.
  • the output memory for the output array 146 can be either a frame buffer or a First-In First-Out (FIFO) buffer. Basically, the scan conversion is done on demand, on the fly. Scan conversion is preferably performed by software in the PC.
  • an architecture in accordance with the invention is preferably just two random access input buffers, a sequential access remap buffer and small (if any) FIFO or bit of pipelining for the output buffer. This implies the output frame buffer is in PC memory.
  • a spatial dithering technique employing error diffusion is used in ultrasound scan conversion. Typical dithering is done in the pixel intensity domain. In accordance with the invention, however, dithering is used in ultrasound scan conversion to approximate pixels in the spatial domain and not in the pixel intensity domain. Spatial dithering is used to approximate values that fall between two input data points. This happens because only discrete radii are sampled but pixels on the display screen can fall between two radii and need to be filtered. Spatial dithering must be used to interpolate between longitudinal sample points.
  • the remap array 144 stores the mapping of each output point to an input point.
  • the input data points are typically in polar coordinates while the output points are in rectilinear coordinates.
  • although the remap array 144 merely contains indices into the input array 142, they can be considered to contain radius (r) and angle (θ) values. Ideally, these values have arbitrary precision and do not have to correspond to actual sampled points. Now consider that these arbitrary precision numbers must be converted into integer values.
  • the integer radius values correspond to discrete samples that were taken and are limited by the radial sampling density of the system.
  • the integer angle values correspond to discrete radial lines that were scanned and are thus limited by the number of scan angles. If spatial dithering is applied, these floating point values can be mapped into fixed integer values without having the artifacts that appear with discrete rounding without error diffusion.
  • FIGS. 9 A- 9 B are a flow chart illustrating a remap array computation technique in accordance with the invention.
  • the scan heads are checked to see if there has been any change. If the scan heads have been changed, processing continues to step 210 where the new head type is configured. After step 210 , or if there has been no change in the scan heads (step 205 ) processing continues to step 215 .
  • the display window is checked to see if there is any zooming, panning or new window-in-window feature. If so, processing continues to step 220 where the user inputs the new viewing parameters.
  • processing continues to step 225 where the remap array is cleared to indicate a new relationship between the input and output arrays.
  • the program chooses a window W to process.
  • all line error values LE and all sample error values SE are initialized to zero.
  • a point counter P is initialized to point to the top left pixel of the window W.
  • the application computes a floating point line number L_FP and sample offset S_FP for each point in a view V. For a phased array, this would be a radius r and an angle θ.
  • any previously propagated error terms L_E, S_E are added to the floating point values L_FP, S_FP for the point P.
  • the floating point terms are rounded to the nearest integers L_R, S_R, which correspond to actual sampled points.
  • the application computes rounding errors as:
  • L_RE = L_FP − L_R; S_RE = S_FP − S_R.
  • the application computes a data index based on a scan data ordered index:
  • REMAP(P) = Index(L_R, S_R).
  • At step 275, a check is made to see if there are more points in the window. If there are more points to be processed, the pointer P is incremented to the next point at step 280. Processing then returns to step 245. Once all points in the window have been processed, processing continues to step 285.
  • At step 285, a check is made to see if there are more windows to be processed. If so, processing returns to step 230. Otherwise, processing is done.
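  • As an illustrative sketch of the remap computation of FIGS. 9A-9B (Python/NumPy, not the patented implementation itself), the loop below mirrors the per-window steps for a single window covering the whole output image; the simple phased-array geometry, the 0.75° line spacing, and the forward-only propagation of the rounding errors are assumptions made for the example.

      import numpy as np

      def compute_remap(out_h, out_w, num_lines, num_samples, deg_per_line=0.75):
          """Build a remap array of input indices, one per output pixel.

          Spatial dithering by error diffusion: the rounding error made when
          snapping the ideal (line, sample) coordinate of a pixel to the nearest
          scanned line and radial sample is carried forward to the next pixel,
          so the fractional position is preserved on average.  The crude
          phased-array geometry (apex at the top center of the window) and the
          forward-only error propagation are illustrative assumptions.
          """
          remap = np.zeros((out_h, out_w), dtype=np.int32)
          for y in range(out_h):
              line_err = 0.0                      # propagated line error L_E
              samp_err = 0.0                      # propagated sample error S_E
              for x in range(out_w):
                  dx, dy = x - out_w / 2.0, y + 1.0
                  s_fp = float(np.hypot(dx, dy))                               # S_FP (radius in sample units)
                  l_fp = float(np.degrees(np.arctan2(dx, dy))) / deg_per_line  # L_FP (line number)
                  l_fp += line_err                # add previously propagated errors
                  s_fp += samp_err
                  l_r, s_r = int(round(l_fp)), int(round(s_fp))                # L_R, S_R
                  line_err, samp_err = l_fp - l_r, s_fp - s_r                  # L_RE, S_RE to diffuse
                  l_r = min(max(l_r + num_lines // 2, 0), num_lines - 1)
                  s_r = min(max(s_r, 0), num_samples - 1)
                  remap[y, x] = l_r * num_samples + s_r                        # ordered index into input data
          return remap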
  • the dithering maps one source point to each output pixel
  • the same remapping architecture can be used to make real-time scan conversion possible in software, even on portable computers.
  • the complicated dithering operation is only performed during initialization or when viewing parameters are changed.
  • the benefits of the dithering are present in all the images.
  • FIG. 10 is a flow chart of an output frame computation engine.
  • beamformed, demodulated input data is read into memory.
  • the output pixel index P is initialized.
  • the output array is set equal to the remapped input array according to the following: OUTPUT(P) = INPUT(REMAP(P)).
  • At step 320, the output pixel index P is incremented.
  • At step 325, a check is done on the pixel index P to see if the image has been formed. If not, processing returns to step 315. Once all the pixels in the image have been computed, processing continues to step 330 where the output image is optionally smoothed. Finally, at step 335, the output image is displayed.
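  • Once the remap array exists, the per-frame work of FIG. 10 reduces to one gather per pixel. A minimal sketch (assuming a flat input buffer of beamformed, demodulated samples and an index array such as the one from the previous sketch):

      import numpy as np

      def scan_convert(input_frame, remap):
          """OUTPUT(P) = INPUT(REMAP(P)) for every output pixel P.

          input_frame : 1-D array of beamformed, demodulated samples in scan order
          remap       : 2-D array of indices into input_frame
          NumPy's fancy indexing performs the "two loads and a store" per pixel.
          """
          return input_frame[remap]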
  • a low-pass spatial filter can be applied to smooth the image after the remapping process.
  • the filter can be a box filter, or non-symmetrical filters can be matched to a desired input resolution characteristic. Filters can be applied in the rectilinear domain that match the orientation or angle of point coordinates at the particular location.
  • one option is a matched filter whose extent is similar to or proportional to the distances between the points being dithered.
  • a high magnification is preferably accompanied by a large filter with much smoothing, whereas in places where the spacing of the sampled radius r or angle θ is small (on the order of one pixel), no filtering may be required.
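  • A minimal post-remap smoothing step is sketched below (Python/NumPy) as a plain k×k box filter; tying k to the local radius/angle spacing, or substituting a non-symmetrical matched kernel, would follow the same pattern. The fixed kernel size is an assumption for the example.

      import numpy as np

      def box_smooth(image, k=3):
          """Smooth a scan-converted image with a k x k box filter."""
          pad = k // 2
          padded = np.pad(image.astype(np.float32), pad, mode="edge")
          out = np.zeros(image.shape, dtype=np.float32)
          for dy in range(k):                      # accumulate the k*k shifted copies
              for dx in range(k):
                  out += padded[dy:dy + image.shape[0], dx:dx + image.shape[1]]
          return out / (k * k)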
  • because the remapping operation is basically two loads and a store, it can be performed using a standard personal computer.
  • the remapping algorithm, when encoded in assembly language, has been shown to work on a 166 MHz Pentium-based PC to obtain very-near real-time operation.
  • the demodulation has been performed on the PC when written in assembly language while still achieving near real-time operation.
  • Text and graphics labels are preferably effected by storing fixed values or colors at the beginning of the input buffer and then mapping to those locations wherever those colors are to be used. In effect, shapes or text are drawn into the remap array and are then automatically overlaid on all of the images at no computational cost.
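  • A sketch of that zero-cost overlay idea (Python/NumPy): a few slots at the start of the input buffer hold fixed gray values, and the remap entry of any pixel belonging to a label simply points at one of those slots. The number of reserved slots and the gray levels are arbitrary choices for the example, and live scan indices are offset by the reserved count.

      import numpy as np

      RESERVED = 2                     # hypothetical: slots 0..1 hold fixed overlay values
      BLACK, WHITE = 0, 1              # indices of the reserved gray levels

      def build_input_buffer(scan_samples):
          """Prepend the fixed overlay values to the live scan data."""
          return np.concatenate(([0.0, 255.0], np.asarray(scan_samples, dtype=float)))

      def add_overlay(remap, label_mask, value_index=WHITE):
          """Point every masked output pixel at a reserved overlay slot."""
          shifted = remap + RESERVED   # live samples now start at index RESERVED
          shifted[label_mask] = value_index
          return shifted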
  • FIGS. 11 A- 11 B are schematic pictorial views of display formats which can be presented on the display 32 of the invention. Rather than displaying a single window of data as is done in prior ultrasound imaging systems, the system of the present invention has multiple window display formats which can be selected by the user.
  • FIG. 11A shows a selectable multi-window display in which three information windows are presented simultaneously on the display. Window A shows the standard B-scan image, while window B shows an M-scan image or a Doppler two-dimensional color flow map.
  • Window C is a user information window which communicates command selections to the user and facilitates the user's manual selections.
  • FIG. 11B is a single-window optional display in which the entire display is used to present only a B-scan image.
  • the display can show both the B-mode and color doppler scans simultaneously by overlaying the two displays or by showing them side-by-side using a split screen feature.
  • FIG. 12 is a functional block diagram of a preferred graphical user interface.
  • a virtual control display 400 includes an ultrasound image control display 410 , a probe model properties display 420 , and a probe specific properties display 500 .
  • the virtual control display 400 is preferably coded as dialog boxes in a Windows environment.
  • FIG. 13 illustrates a dialog box for the ultrasound image control 410 .
  • the user can select a probe head type 412 , a zone display 414 , a demodulation filter 416 , and an algorithm option 418 .
  • the user also can initiate the ultrasound scan through this dialog box.
  • the probe model properties display 420 includes model type 425 , safety information 430 , image Integrated Pulse Amplitude (IPA) data 435 , doppler IPA data 440 , color IPA data 445 , probe geometry 450 , image zones data 455 , doppler zones data 460 , color zones data 465 , image apodization 470 , doppler apodization 475 , and color apodization 480 . These are preferably encoded as dialog boxes. Through the model-properties dialog box 425 , a user can enter general settings for the probe model.
  • FIG. 14A illustrates a dialog box for entering and viewing probe model properties. Entered parameters are downloaded to the ultrasound probe.
  • FIG. 14B illustrates a dialog box for entering and viewing safety information 430 .
  • a user can enter general settings 432 and beam width table data 434 per governing standards.
  • FIG. 14C illustrates a dialog box for entering and viewing image IPA data 435 .
  • the dialog box displays beamformed output values, listed in volts as a function of image display zones for various drive voltages. Similar dialog boxes are used to enter the doppler and color IPA data 440 , 445 .
  • FIG. 14D illustrates a dialog box for effecting the image apodization function 470 .
  • the operator can enter and view general settings 472 and vector information 474 .
  • the user can select active elements for array windowing (or apodization).
  • the probe specific property display 500 includes dialog boxes for entering probe specifics 510 , image Field-Of-View (FOV) data 520 , doppler FOV data 530 , and color FOV data 540 .
  • through the probe specifics dialog box 510 , the user can enter general settings 512 , imaging static information 514 , doppler static information 516 , and FOV settings 518 .
  • FIG. 15A illustrates a dialog box for entering and viewing probe specific information. Any number of probes can be supported.
  • FIGS. 15B-15C illustrate dialog boxes for entering image FOV data 520 .
  • a user can enter general settings 522 , breakpoint TGC data 524 , zone boundaries 526 , and zone duration data 528 .
  • Dialog boxes for the doppler and color FOV data displays 530 , 540 are similar and allow the entry of general settings 532 , 542 , breakpoint TGC data 534 , 544 , and PRF data 536 , 546 .
  • FIGS. 15 D- 15 J illustrate additional windows and control panels for controlling an ultrasound imaging system in accordance with the invention.
  • FIG. 15D shows a viewing window for the region of interest and a control panel situated side by side with the scan image.
  • FIG. 15E shows controls for the doppler field of view and other selectable settings.
  • FIG. 15F shows the color field of view controls.
  • FIG. 15G shows properties of the probe.
  • FIG. 15H shows the color IPA data for a probe.
  • FIG. 15I shows the probe geometry settings for a linear array.
  • FIG. 15J shows settings for doppler apodization.
  • FIG. 16 illustrates the zoom feature of a preferred embodiment of the imaging system in accordance with the invention.
  • detailed features of a phantom, or internal anatomical features 600 of a patient that are shown on screen 32 can be selected and enlarged within or over a display window.
  • a region 602 is selected by the user and is enlarged at window 604 .
  • a plurality of such regions can be simultaneously enlarged and shown on screen 32 in separate or overlying windows. If two scan heads are in use, different views can be shown at the same time, or previously recorded images can be recalled from memory and displayed beside an image presented in real time.
  • the architecture of the integrated front-end probe approach was designed to provide small size, low power consumption and maximal flexibility in scanning, including: 1) multi-zone focus on transmission; 2) ability to drive a variety of probes, such as linear/curved linear, linear/trapezoidal, and sector scan; 3) ability to provide M-mode, B-mode, Color Flow Map and Doppler Sonogram displays; 4) multiple, selectable pulse shapes and frequencies; and 5) different firing sequences.
  • Different embodiments for the integrated front-end system 700 are shown in FIGS. 17A, 17B and 17C.
  • Modules unique to this invention are the blocks corresponding to: beamforming chip 702 , transmit/receive chip 704 , preamplifier/TGC chip 706 .
  • the block labeled “front-end probe” (front-end controller) directly controls the routine operation of the ultrasound scan head by generating clock and control signals provided to modules 702 , 704 , 706 and to the memory unit 708 . These signals are used to assure continuous data output and to indicate the module for which the data appearing at the memory-unit output are intended.
  • Higher level control of the scan head 710 as well as initialization, data processing and display functions, are provided by a general purpose host computer 720 , such as a desktop PC, laptop or palmtop.
  • the front-end controller also interfaces with the host computer, e.g. via PCI bus or FireWire 714 , to allow the host to write control data into the scanhead memory unit and receive data back.
  • the front-end controller also provides buffering and flow-control functions, as data from the beamformer must be sent to the host via a bandwidth-constrained link, to prevent data loss.
  • FIG. 17A shows a hardware-based 722 implementation, in which a dedicated Doppler-processing chip is mounted on a back-end card 724 and used as a co-processor to the host computer 720 to accomplish the CFM and DS computations.
  • FIG. 17B shows a software implementation in which the CFM and DS computations are performed by the host computer.
  • FIG. 17C shows yet another system integration, in which the transducer array and the front-end processing units are not integrated into a single housing but are connected by coaxial cables.
  • the front-end units include the front-end controller, the memory and the three modules 704 (transmit/receive chip), 706 (preamp/TGC chip) and 702 (the beamforming chip) as shown in the Figure.
  • FireWire refers to IEEE standard 1394, which provides high-speed data transmission over a serial link. This allows use of high-volume, low cost commercial parts for the interface.
  • the standard supports an asynchronous data transfer mode that can be used to send commands and configuration data to the probe head memory. It can also be used to query the status of the head and obtain additional information, such as the activation of any buttons or other input devices on the head. Additionally, the asynchronous data transfer mode can be used to detect the type of probe head attached.
  • An isochronous transfer mode can be used to transfer data back from the beamformer to the host. These data may come directly from the A/D or from the demodulator or some combination.
  • the Doppler processed data can be sent via FireWire.
  • the data can be Doppler processed via software or hardware in the host.
  • There also exists a wireless version of the FireWire standard, allowing communication via an optical link for untethered operation. This can be used to provide greater freedom of movement when the probe head is connected to the host using wireless FireWire.
  • the preamp/TGC chip as implemented consists of 32 integrated parallel, low-noise, low-power amplifier/TGC units. Each unit has 60-dB programmable gain, a noise voltage of less than 1.5 nV/√Hz, and dissipates less than 11 mW per receiver channel.
  • the multi-channel transmit/receive chip consists of a global counter, a global memory and a bank of parallel dual-channel transmit/receiver controllers.
  • within each controller 740 there are a local memory 745 , a delay comparator, a frequency counter & comparator, a pulse counter & comparator, a phase selector, a transmit/receive select/demux switch (T/R switch), and level shifter units.
  • the global counter 742 broadcasts a master clock and bit values to each channel processor 740 .
  • the global memory 744 controls transmit frequency, pulse number, pulse sequence and transmit/receive select.
  • the local delay comparator 746 provides delay selection for each channel. For example, with a 60 MHz clock and a 10-bit global counter, a delay of up to 17 μs can be provided for each channel.
  • the local frequency counter 748 provides programmable transmit frequency.
  • a 4-bit counter with a comparator provides up to sixteen different frequency selections.
  • the local pulse counter 750 provides different pulse sequences.
  • a 6-bit counter with a comparator can provide programmable transmitted pulse lengths from one pulse up to 64 pulses.
  • the locally programmable phase selector provides sub-clock delay resolution.
  • programmable subclock delay resolution allows the delay resolution to be more precise than the clock period.
  • the output of the frequency counter is gated with a phase of the clock that is programmable on a per-channel basis.
  • a two-phase clock is used and the output of the frequency counter is either gated with the asserted or deasserted clock.
  • multiple skewed clocks can be used.
  • One per channel can be selected and used to gate the coarse timing signal from the frequency counter. For example, for a 60-MHz master clock, a two-to-one phase selector provides 8-ns delay resolution and a four-to-one phase selector provides 4-ns delay resolution.
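  • The delay arithmetic quoted above (the 17 μs coarse delay range and the sub-clock resolutions) can be checked directly; the snippet below (Python) simply re-derives those figures from the stated 60 MHz clock, 10-bit counter, and phase-selector ratios.

      clock_hz = 60e6                          # 60 MHz master clock
      period_ns = 1e9 / clock_hz               # ~16.7 ns per clock cycle

      coarse_delay_us = (2 ** 10) * period_ns / 1e3
      print(f"10-bit counter -> coarse delay range ~ {coarse_delay_us:.1f} us")   # ~17.1 us

      for phases in (2, 4):                    # sub-clock resolution = clock period / number of phases
          print(f"{phases}-to-1 phase selector -> {period_ns / phases:.1f} ns resolution")
          # prints ~8.3 ns and ~4.2 ns, which the text rounds to 8 ns and 4 ns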
  • also included are the integrated transmit/receive select (T/R) switch 754 and the integrated high-voltage level shifter 750 for the transmit pulses.
  • a single-chip transmit/receive chip capable of handling 64 channel drivers and 32-channel receivers can be used, each channel having a controller as shown in FIG. 18.
  • the T/R select/mux switch and the high-voltage level shifter are separated from the other components 760 on a separate chip 762 to allow use of different high-voltage semiconductor technologies, such as high-breakdown silicon CMOS/JFET or GaAs technology for production of these components.
  • the basic method for pulsed-Doppler ultrasound imaging is illustrated in FIG. 20.
  • the waveform consists of a burst of N pulses 770 . After each pulse as many range (depth) samples as needed are collected.
  • the time evolution of the velocity distribution of material within the range gate is displayed as a sonogram 772 , a two-dimensional display in which the horizontal axis represents time and the vertical axis velocity (as assessed by Doppler shift). Different regions can be interrogated by moving the range gate and varying its size.
  • a Doppler sonogram can be generated using single-range-gate Doppler processing, as shown in FIG. 20. The operation of this method is as follows.
  • a sequence of N ultrasonic pulses is transmitted at a pulse repetition frequency f prf along a given viewing angle.
  • the return echoes are range gated and only returns 774 from a single range bin are used, meaning that only the returned signals corresponding to a region at a selected distance (e.g. from depth d to d+ ⁇ d) from the transducer array along the selected viewing angle are processed to extract Doppler information.
  • the velocity profiles of scatterers in the selected region can be obtained by computing the Doppler shifts of the echoes received from the scatterers. That is, Fourier transformation 776 of the received time-domain signal provides frequency information, including the desired Doppler, f d .
  • the velocity of a scatterer moving along the beam is related to its Doppler shift by the standard pulsed-Doppler relation f_d ≈ 2vf_c/c, where c is the speed of sound in the transmitting medium and f_c is the center frequency of the transducer.
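  • A quick numerical illustration of that relation (Python); the transducer frequency and flow velocity below are arbitrary example values, not figures from the text.

      c = 1540.0        # speed of sound in soft tissue, m/s
      fc = 3.5e6        # transducer center frequency, Hz (assumed)
      v = 0.5           # scatterer velocity along the beam, m/s (assumed)

      fd = 2 * v * fc / c                      # standard pulsed-Doppler shift
      print(f"Doppler shift ~ {fd:.0f} Hz")    # ~2273 Hz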
  • in color flow mapping (CFM), velocities are estimated not only along a single direction or line segment, but over a number of directions (multiple scan lines) spanning a region of interest.
  • the velocity information is typically color-coded (e.g. red indicates flow toward the transducer, blue away) and superimposed over a B-mode image that displays the underlying anatomy.
  • a color-flow map 780 based on pulsed-Doppler processing is shown in FIG. 21.
  • the basic single-range bin system of FIG. 20 can be extended to measure a number of range gates by sampling at different depths and retaining the samples in storage for additional processing. Note that this does not increase the acquisition time, as data are collected from the same RF line. Sweeping the beam over an area then makes it possible to assemble an image of the velocities in a 2D region of interest.
  • the data from J range bins 782 along a single direction are processed in parallel. After N pulse returns are processed, the outputs represent a J ⁇ N range-vs-Doppler distribution, which in turn can be used to generate a J ⁇ N velocity distribution profile.
  • the range gate size and position can be determined by the user. This choice determines both the emitted pulse length and pulse repetition frequency.
  • the size of the range gate is determined by the length of the pulse: for a burst of M sine periods at the transducer center frequency f_c, the pulse duration is M/f_c and the corresponding gate length is l_g = cM/(2f_c), where M is the number of sine periods.
  • the depth of the gate determines how quickly pulse echo lines can be acquired: for a gate at depth d, the maximum rate is approximately c/(2d), since each echo must return before the next pulse is fired.
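  • These range-gate relations with illustrative numbers (Python); the frequency, pulse length, and depth are assumptions, and the formulas are the standard pulsed-Doppler bookkeeping rather than expressions quoted from the text.

      c = 1540.0       # speed of sound, m/s
      fc = 3.5e6       # transducer center frequency, Hz (assumed)
      M = 8            # sine periods in the emitted pulse (assumed)
      d = 0.10         # depth of the range gate, m (assumed)

      pulse_s = M / fc                   # pulse duration
      gate_m = c * pulse_s / 2.0         # range-gate length set by the pulse length
      prf_max = c / (2.0 * d)            # each echo must return before the next pulse fires
      print(f"pulse {pulse_s * 1e6:.2f} us, gate {gate_m * 1e3:.2f} mm, max PRF {prf_max:.0f} Hz")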
  • The generic waveform for pulsed-Doppler ultrasound imaging is shown in FIG. 22, where the waveform consists of a burst of N pulses 800 . As many range (depth) samples as needed are collected following each pulse in the burst.
  • FIG. 22 also shows a block diagram 810 of a conventional signal processor for this imaging technique, where the returned echoes received by each transducer are sampled and coherently summed prior to in-phase and quadrature demodulation. The down converted/basebanded returns are converted to a digital representation, and then stored in a buffer memory until all the pulse returns comprising a coherent interval are received.
  • the N pulse returns collected for each depth are then read from memory, a weighting sequence, v(n), is applied to control Doppler sidelobes, and an N-point FFT is computed.
  • the FFT 818 output is passed on to a display unit, directly or after time averaging of Doppler samples, for subsequent display.
  • the CDP device described here performs all of the functions indicated in the dotted box of FIG. 22, except for A/D conversion, which is not necessary because the CDP device provides the analog sampled data function.
  • This CDP Pulsed-Doppler Processor (PDP) device has the capability to compute a matrix-matrix product, and therefore has a much broader range of capabilities than needed to implement the functions shown within the dotted lines.
  • the PDP device computes the product of two real-valued matrices by summing the outer products formed by pairing columns of the first matrix with corresponding rows of the second matrix.
  • to use the PDP device, the Doppler filtering equation is decomposed into a sum of real-valued matrix operations.
  • the Doppler filtering is accomplished by computing a Discrete Fourier Transform (DFT) of the weighted pulse returns for each depth of interest.
  • the weighting function can be combined with the DFT kernel to obtain a matrix of Doppler filter transform coefficients with elements given by w(k,n) = v(n)e^(−j2πkn/N), where N is the number of pulse returns.
  • the double-indexed variables may all be viewed as matrix indices. Therefore, in matrix representation, the Doppler filtering can be expressed as a matrix product operation. It can be seen that the PDP device can be used to perform each of the four matrix multiplications, thereby implementing the Doppler filtering operation.
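  • The decomposition into real-valued matrix products can be written out directly; the sketch below (Python/NumPy, with an assumed Hamming weighting) shows the mathematics rather than the CDP hardware. With W = W_r + jW_i the weighted DFT coefficient matrix and F = F_r + jF_i the buffered pulse returns, G = WF requires exactly four real matrix multiplications.

      import numpy as np

      def doppler_filter(returns, window=None):
          """Doppler filtering of pulse returns as four real matrix products.

          returns : complex array, shape (N, J) -- N pulse returns for J depths
          The weighting v(n) is folded into the DFT kernel to form one
          coefficient matrix W, and G = W @ returns is evaluated through its
          real and imaginary parts.
          """
          N = returns.shape[0]
          v = np.hamming(N) if window is None else window          # v(n), assumed window
          n = np.arange(N)
          k = np.arange(N)[:, None]
          W = v * np.exp(-2j * np.pi * k * n / N)                  # w(k,n) = v(n) e^{-j2*pi*k*n/N}

          Wr, Wi = W.real, W.imag
          Fr, Fi = returns.real, returns.imag
          Gr = Wr @ Fr - Wi @ Fi                                   # real part: two real products
          Gi = Wi @ Fr + Wr @ Fi                                   # imaginary part: two more
          return Gr + 1j * Gi                                      # N x J range-Doppler matrix

      # sanity check against a direct weighted FFT along the pulse axis
      rng = np.random.default_rng(0)
      x = rng.standard_normal((16, 4)) + 1j * rng.standard_normal((16, 4))
      assert np.allclose(doppler_filter(x), np.fft.fft(x * np.hamming(16)[:, None], axis=0))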
  • A block diagram of the PDP device described in this invention is shown in FIG. 22.
  • the device includes a J-stage CCD tapped delay line, J CCD multiplying D/A converters (MDACs), J×K accumulators, a J×K Doppler sample buffer, and a parallel-in-serial-out (PISO) output shift register.
  • the MDACs share a common 8-bit digital input on which elements from the coefficient matrix are supplied.
  • the tapped delay line performs the function of a sample-and-hold, converting the continuous-time analog input signal to a sampled analog signal.
  • a two-PDP implementation 840 for color flow mapping in an ultrasound imaging system is shown in FIG. 23.
  • the top PDP component computes all the terms of the form W_k f_r and W_i f_r as shown above, while the bottom component computes the terms of the form −W_i f_i and W_k f_i.
  • the outputs of each component are then summed to alternately obtain g_r and g_i.
  • Doppler and color flow map processing involves a significant amount of computation. This processing may be accomplished in software using a general-purpose microprocessor. The presence of instructions optimized for matrix-matrix operations, such as the Intel MMX feature set, can substantially improve performance.
  • a software flow chart for color-flow map computation based on the FFT computation algorithm is shown in FIG. 24. After initialization 900 , the downconverted data is obtained 902 and the pointer P is set to the beginning of the scan line 904 ; the data is averaged and stored 906 , a weighting function is applied 908 , the FFT is computed 910 , and the magnitude z(k) is computed for each frequency 912 , followed by the computation of first and second moments 914 and display thereof in color 916 . The pointer is incremented 918 and each scan line is processed as needed.
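  • A sketch of the per-scan-line moment computation of FIG. 24 (Python/NumPy); the Hamming weighting and the frequency-to-velocity scaling are assumptions added for the example.

      import numpy as np

      def cfm_moments(returns, prf, fc, c=1540.0):
          """Mean velocity and velocity spread per range bin from the Doppler spectrum.

          returns : complex array, shape (N, J) -- N pulse returns for J range bins
          Follows the FFT-based flow of FIG. 24: weight, FFT, magnitude, then the
          first and second moments of the power spectrum for each range bin.
          """
          N = returns.shape[0]
          spec = np.fft.fft(returns * np.hamming(N)[:, None], axis=0)
          power = np.abs(np.fft.fftshift(spec, axes=0)) ** 2              # z(k)^2
          fd = np.fft.fftshift(np.fft.fftfreq(N, d=1.0 / prf))            # Doppler frequencies, Hz
          velo = fd * c / (2.0 * fc)                                      # map shift to axial velocity

          total = power.sum(axis=0)
          mean_v = (velo[:, None] * power).sum(axis=0) / total            # first moment
          var_v = ((velo[:, None] - mean_v) ** 2 * power).sum(axis=0) / total
          return mean_v, np.sqrt(var_v)                                   # mean velocity, spread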
  • a software flow chart for color-flow map computation based on the cross-correlation computation is shown in FIG. 25.
  • the scan line data is obtained 942 , followed by the range bin data 944 .
  • the cross correlation is computed 946 and averaged 948 , and the velocity distribution 950 , first and second moments 952 are obtained and displayed 954 .
  • the range bin index is incremented 956 and the process is repeated.
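  • For the cross-correlation flow of FIG. 25, one common estimator derives the mean Doppler angle from the lag-one correlation of the pulse returns in each range bin; this particular (Kasai-style autocorrelation) formulation is an assumption, since the text does not spell out the exact estimator used (Python/NumPy sketch).

      import numpy as np

      def xcorr_velocity(returns, prf, fc, c=1540.0):
          """Mean axial velocity per range bin from the lag-one pulse-to-pulse correlation.

          returns : complex array, shape (N, J) -- N pulse returns for J range bins
          """
          r1 = (returns[1:] * np.conj(returns[:-1])).sum(axis=0)   # lag-one correlation per bin
          fd = np.angle(r1) * prf / (2.0 * np.pi)                  # mean Doppler shift, Hz
          return fd * c / (2.0 * fc)                               # mean velocity, m/s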

Abstract

An ultrasound imaging system includes a scan conversion process for converting ultrasound data into a standard display format. The conversion can be performed on a personal computer by programming the computer to convert data from polar coordinates to Cartesian coordinates suitable for display on a computer monitor. The data is provided from a scan head enclosure that houses an array of ultrasonic transducers and the circuitry associated therewith, including pulse synchronizer circuitry used in the transmit mode for transmission of ultrasonic pulses and beam forming circuitry used in the receive mode to dynamically focus reflected ultrasonic signals returning from the region of interest being imaged.

Description

    CROSS REFERENCES TO RELATED APPLICATIONS
  • This is a Continuation application of U.S. Ser. No. 09/447,144, filed on Nov. 23, 1999 which is a Continuation application of U.S. Ser. No. 09/203,877, filed on Dec. 2, 1998 which is a Continuation application of International Application No. PCT/US97/24291 filed on Dec. 23, 1997 which is a Continuation-in-part application of U.S. Ser. No. 08/773,647 filed on Dec. 24, 1996 which is a Continuation-in-part of International Application No. PCT/US96/11166, filed on Jun. 28, 1996, which is a Continuation-in-Part application of U.S. Ser. No. 08/599,816, filed on Feb. 12, 1996, which is a Continuation-in-Part of U.S. Ser. Nos. 08/496,804 and 08/496,805 both filed on Jun. 29, 1995, the entire contents of the above applications are being incorporated herein by reference.[0001]
  • BACKGROUND OF THE INVENTION
  • Conventional ultrasound imaging systems typically include a hand-held scan head coupled by a cable to a large rack-mounted console processing and display unit. The scan head typically includes an array of ultrasonic transducers which transmit ultrasonic energy into a region being imaged and receive reflected ultrasonic energy returning from the region. The transducers convert the received ultrasonic energy into low-level electrical signals which are transferred over the cable to the processing unit. The processing unit applies appropriate beam forming techniques such as dynamic focusing to combine the signals from the transducers to generate an image of the region of interest. [0002]
  • Typical conventional ultrasound systems include transducer arrays having a plurality, for example [0003] 128, of ultrasonic transducers. Each transducer is associated with its own processing circuitry located in the console processing unit. The processing circuitry typically includes driver circuits which, in the transmit mode, send precisely timed drive pulses to the transducer to initiate transmission of the ultrasonic signal. These transmit timing pulses are forwarded from the console processing unit along the cable to the scan head. In the receive mode, beam forming circuits of the processing circuitry introduce the appropriate delay into each low-level electrical signal from the transducers to dynamically focus the signals such that an accurate image can subsequently be generated.
  • For phased array or curved linear scan heads, the ultrasound signal is received and digitized in its natural polar (r,θ) form. For display, this representation is inconvenient, so it is converted into a rectangular (x,y) representation for further processing. The rectangular representation is digitally corrected for the dynamic range and brightness of various displays and hard-copy devices. The data can also be stored and retrieved for redisplay. In making the conversion between polar and rectangular coordinates, the (x,y) values must be computed from the (r,θ) values because the points on the (r,θ) array and the rectangular (x,y) grid are not coincident. [0004]
  • In prior scan conversion systems, each point on the (x,y) grid is visited and its value is computed from the values of the two nearest θ values by linear interpolation or from the four nearest neighbors on the (r,θ) array by bi-linear interpolation. This is accomplished by use of a finite state machine to generate the (x,y) traversal pattern, a bi-directional shift register to hold the (r,θ) data samples, and a large number of digital logic and memory units to control the process and ensure that the correct asynchronously received samples of (r,θ) data arrive for interpolation at the right time for each (x,y) point. This prior implementation can be both inflexible and unnecessarily complex. Despite the extensive control hardware, only a single path through the (x,y) array is possible. [0005]
  • SUMMARY OF THE INVENTION
  • In a preferred embodiment of the invention, scan data is directed into a computer after beamforming and scan conversion is performed to convert the scan data into a display format. In a preferred embodiment, scan conversion can be performed entirely using a software module on a personal computer. Alternatively a board with additional hardware can be inserted to provide selected scan conversion functions or to perform the entire scan conversion process. For many applications, the software system is preferred as additional hardware is minimized so the personal computer can be a small portable platform, such as a laptop or palmtop computer. [0006]
  • Scan conversion is preferably performed using a spatial dithering process described in greater detail below. Spatial dithering simplifies the computational requirements for scan conversion while retaining image resolution and quality. Thus, scan conversion can be performed on a personal computer without the need for more complex interpolation techniques and still provide conversion at frame rates suitable for real time ultrasound imaging. [0007]
  • Preferably, the scan conversion procedure includes an input array, a remap array, and an output array. The remap array is an array of indices or pointers, the size of the output image, used to determine where to get each pixel from the input array. The numbers in each position in the remap array indicate where in the input data to find the pixel that will go into the output array at the same position. Thus, the remap array and output array can be thought of as having the same geometry, while the input array and output array have the same type of data, i.e., actual image data. [0008]
  • The input array has new data for each ultrasound frame, which means that it processes the data and puts the data in the output array on every frame. In accordance with a preferred embodiment of the invention, there is a new ultrasound frame approximately every 1/30 second. Consequently, the remap array data can be generated relatively slowly (but still well under about one second) as long as the routine operation of computing a new output image from a new input data set is performed at the frame rate of approximately 30 frames per second. This allows a general purpose personal computer to perform the task of generating the data for the remap array without compromising performance, but also without having to dedicate additional hardware to the task. In a computing system having a digital signal processor (DSP), the DSP can perform the computations of the remap array. [0009]
  • Alternatively, certain scan conversion functions can be performed by hardware inserted into the personal computer on a circuit board. This board or a card can be inserted and used as an interface to deliver data in the proper form to the PC bus controller.[0010]
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The foregoing and other objects, features and advantages of the invention will be apparent from the following more particular description of preferred embodiments of the invention, as illustrated in the accompanying drawings in which like reference characters refer to the same parts throughout the different views. The drawings are not necessarily to scale, emphasis instead being placed upon illustrating the principles of the invention. [0011]
  • FIG. 1 is a block diagram of a conventional imaging array as used in an ultrasound imaging system. [0012]
  • FIG. 2A is a schematic illustration of the relationship between a linear ultrasound transducer array and a rectangular scan region in accordance with the present invention. [0013]
  • FIG. 2B is a schematic illustration of the relationship between a curved linear ultrasound transducer array and a curved scan region in accordance with the present invention. [0014]
  • FIG. 2C is a schematic illustration of the relationship between a linear ultrasound transducer array and a trapezoidal scan region in accordance with the present invention. [0015]
  • FIG. 2D is a schematic illustration of a phased array scan region. [0016]
  • FIG. 3 is a schematic pictorial view of a preferred embodiment of the ultrasound imaging system of the present invention. [0017]
  • FIG. 4A is a schematic functional block diagram of a preferred embodiment of the ultrasound imaging system of the invention. [0018]
  • FIG. 4B is a schematic functional block diagram of an alternative preferred embodiment of the ultrasound imaging system of the invention. [0019]
  • FIG. 5A is a schematic diagram of a beamforming and filtering circuit in accordance with the invention. [0020]
  • FIG. 5B is a schematic diagram of another preferred embodiment of a beamforming and filtering circuit in accordance with the invention. [0021]
  • FIG. 5C is a schematic diagram of another preferred embodiment of a beamforming and filtering circuit in accordance with the invention. [0022]
  • FIG. 5D is a schematic diagram of a low pass filter in accordance with the invention. [0023]
  • FIG. 5E is an example of an interface circuit board in accordance with the invention. [0024]
  • FIG. 5F is a preferred embodiment of an integrated beamforming circuit in accordance with the invention. [0025]
  • FIG. 6 is a graphical illustration of the passband of a filter in accordance with the invention. [0026]
  • FIG. 7A is a schematic diagram of input points overlayed on a display. [0027]
  • FIG. 7B is a schematic diagram of a display of FIG. 6 having input data converted to pixels. [0028]
  • FIG. 8 is a schematic diagram of a preferred embodiment of a general purpose image remapping architecture. [0029]
  • FIGS. [0030] 9A-9B are a flow chart illustrating a remap array computation technique in accordance with the invention.
  • FIG. 10 is a flow chart of an output frame computation engine. [0031]
  • FIGS. [0032] 11A-11B are schematic pictorial views of two user-selectable display presentation formats used in the ultrasound imaging system of the invention.
  • FIGS. [0033] 12A-12B are functional block diagrams of a preferred graphical user interface.
  • FIG. 13 illustrates a dialog box for ultrasound image control. [0034]
  • FIGS. [0035] 14A-14D illustrate display boxes for entering system information.
  • FIGS. [0036] 15A-15C illustrate additional dialog boxes for entering probe or FOV data.
  • FIGS. [0037] 15D-15J illustrate additional display and dialog boxes for a preferred embodiment of the invention.
  • FIG. 16 illustrates imaging and display operations of a preferred embodiment of the invention. [0038]
  • FIGS. [0039] 17A-17C illustrate preferred embodiments of integrated probe systems in accordance with the invention.
  • FIG. 18 illustrates a 64 channel integrated controller of a transmit/receive circuit for an ultrasound system. [0040]
  • FIG. 19 illustrates another preferred embodiment of a transmit and receive circuit. [0041]
  • FIG. 20 illustrates a Doppler Sonogram system in accordance with the invention. [0042]
  • FIG. 21 illustrates a color flow map based on a fast fourier transform pulsed Doppler processing system in accordance with the invention. [0043]
  • FIG. 22 illustrates a processing system and waveform generation in accordance with the invention. [0044]
  • FIG. 23 is a system for generating a color flow map in accordance with the invention. [0045]
  • FIG. 24 is a process flow sequence for computing a color flow map in accordance with the invention. [0046]
  • FIG. 25 is a process flow sequence for generating a color flow map using a cross correlation method.[0047]
  • DETAILED DESCRIPTION OF THE INVENTION
  • A schematic block diagram of an [0048] imaging array 18 of N piezoelectric ultrasonic transducers 18(1)-18(N) as used in an ultrasound imaging system is shown in FIG. 1. The array of piezoelectric transducer elements 18(1)-18(N) generate acoustic pulses which propagate into the image target (typically a region of human tissue) or transmitting media with a narrow beam 180. The pulses propagate as a spherical wave 185 with a roughly constant velocity. Acoustic echoes in the form of returning signals from image points Ip or reflectors are detected by the same array 18 of transducer elements, or another receiving array and can be displayed in a fashion to indicate the location of the reflecting structure.
  • The acoustic echo from the image point I[0049] p in the transmitting media reaches each transducer element 18(1)-18(N) of the receiving array after various propagation times. The propagation time for each transducer element is different and depends on the distance between each transducer element and the image point Ip. This holds true for typical ultrasound transmitting media, i.e. soft bodily tissue, where the velocity of sound is at least relatively constant. Thereafter, the received information is displayed in a manner to indicate the location of the reflecting structure.
  • In two-dimensional B-mode scanning, the pulses can be transmitted along a number of lines-of-sight as shown in FIG. 1. If the echoes are sampled and their amplitudes are coded as brightness, a grey scale image can be displayed on a cathode ray tube (CRT) or monitor. An image typically contains 128 such scanned lines at 0.75° angular spacing, forming a 90° sector image. Because the velocity of sound in water is [0050] 1.54×10^5 cm/sec, the round-trip time to a depth of 16 cm will be 208 μs. Thus, the total time required to acquire data along 128 lines of sight (for one image) is 26.6 ms. If other signal processors in the system are fast enough to keep up with this data acquisition rate, two-dimensional images can be produced at rates corresponding to standard television video. For example, if the ultrasound imager is used to view reflected or back scattered sound waves through the chest wall between a pair of ribs, the heart pumping can be imaged in real time.
  • The ultrasonic transmitter is typically a linear array of piezoelectric transducers [0051] 18(1)-18(N) (typically spaced half-wavelength apart) for steered arrays whose elevation pattern is fixed and whose azimuth pattern is controlled primarily by delay steering. The radiating (azimuth) beam pattern of a conventional array is controlled primarily by applying delayed transmitting pulses to each transducer element 18(1)-18(N) in such a manner that the energy from all the transmitters summed together at the image point Ip produces a desired beam shape. Therefore, a time delay circuit is needed in association with each transducer element 18(1)-18(N) for producing the desired transmitted radiation pattern along the predetermined direction.
  • As previously described, the [0052] same array 18 of transducer elements 18(1)-18(N) can be used for receiving the return signals. The reflected or echoed beam energy waveform originating at the image point reaches each transducer element after a time delay equal to the distance from the image point to the transducer element divided by the assumed constant speed of the propagation of waves in the media. Similar to the transmitting mode, this time delay is different for each transducer element. At each receiving transducer element, these differences in path length should be compensated for by focusing the reflected energy at each receiver from the particular image point for any given depth. The delay at each receiving element is a function of the distance measured from the element to the center of the array and the viewing angular direction measured normal to the array.
  • The beam forming and focusing operations involve forming a sum of the scattered waveforms as observed by all the transducers, but in this sum, the waveforms must be differentially delayed so they will all arrive in phase and properly weighted in the summation. Hence, a beam forming circuit is required which can apply a different delay on each channel, and vary that delay with time. Along a given direction, as echoes return from deeper tissue, the receiving array varies its focus continually with depth. This process is known as dynamic focusing. [0053]
  • After the received beam is formed, it is digitized in a conventional manner. The digital representation of each received pulse is a time sequence corresponding to a back-scattering cross section of ultrasonic energy returning from a field point as a function of range at the azimuth formed by the beam. Successive pulses are pointed in different directions, covering a field of view from −45° to +45°. In some systems, time averaging of data from successive observations of the same point (referred to as persistence weighting) is used to improve image quality. [0054]
  • FIGS. [0055] 2A-2D are schematic diagrams illustrating the relationship between the various transducer array configurations used in the present invention and their corresponding scan image regions. FIG. 2A shows a linear array 18A which produces a rectangular scanning image region 180A. Such an array typically includes 128 transducers.
  • FIG. 2B is a schematic diagram showing the relationship between a curved [0056] linear transducer array 18B and the resulting sectional curved image scan region 180B. Once again, the array 18B typically includes 128 adjacent transducers.
  • FIG. 2C shows the relationship between a [0057] linear transducer array 18C and a trapezoidal image region 180C. In this embodiment, the array 18C is typically formed from 192 adjacent transducers, instead of 128. The linear array is used to produce the trapezoidal scan region 180C by combining linear scanning as shown in FIG. 2A with phased array scanning. In one embodiment, the 64 transducers on opposite ends of the array 18C are used in a phased array configuration to achieve the curved angular portions of the region 180 C at its ends. The middle 64 transducers are used in the linear scanning mode to complete the rectangular portion of the region 180C. Thus, the trapezoidal region 180C is achieved using a sub-aperture scanning approach in which only 64 transducers are active at any one time. In one embodiment, adjacent groups of 64 transducers are activated alternately. That is, first, transducers 1-64 become active. Next, transducers 64-128 become active. In the next step, transducers 2-65 are activated, and then transducers 65-129 are activated. This pattern continues until transducers 128-192 are activated. Next, the scanning process begins over again at transducers 1-64.
  • FIG. 2D shows a short linear array of [0058] transducers 18D used to perform phased array imaging in accordance with the invention. The linear array 18D is used via phased array beam steering processing to produce an angular slice region 180D.
  • FIG. 3 is a schematic pictorial view of an [0059] ultrasound imaging system 10 of the present invention. The system includes a hand-held scan head 12 coupled to a portable data processing and display unit 14 which can be a laptop computer. Alternatively, the data processing and display unit 14 can include a personal computer or other computer interfaced to a CRT for providing display of ultrasound images. The data processor display unit 14 can also be a small, lightweight, single-piece unit small enough to be hand-held or worn or carried by the user. Although FIG. 3 shows an external scan head, the scan head of the invention can also be an internal scan head adapted to be inserted through a lumen into the body for internal imaging. For example, the head can be a transesophageal probe used for cardiac imaging.
  • The [0060] scan head 12 is connected to the data processor 14 by a cable 16. In an alternative embodiment, the system 10 includes an interface unit 13 (shown in phantom) coupled between the scan head 12 and the data processing and display unit 14. The interface unit 13 preferably contains controller and processing circuitry including a digital signal processor (DSP). The interface unit 13 can perform required signal processing tasks and can provide signal outputs to the data processing unit 14 and/or scan head 12. For use with a palmtop computer, the interface unit 13 is preferably an internal card or chip set. When used with a desktop or laptop computer, the interface unit 13 can instead be an external device.
  • The hand-held [0061] housing 12 includes a transducer section 15A and a handle section 15B. The transducer section 15A is maintained at a temperature below 41° C. so that the portion of the housing that is in contact with the skin of the patient does not exceed this temperature. The handle section 15B does not exceed a second higher temperature preferably 50° C.
  • FIG. 4A is a schematic functional block diagram of one embodiment of the [0062] ultrasound imaging system 10 of the invention. As shown, the scan head 12 includes an ultrasonic transducer array 18 which transmits ultrasonic signals into a region of interest or image target 11, such as a region of human tissue, and receives reflected ultrasonic signals returning from the image target. The scan head 12 also includes transducer driver circuitry 20 and pulse synchronization circuitry 22. The pulse synchronizer 22 forwards a series of precisely timed and delayed pulses to high voltage driver circuits in the drivers 20. As each pulse is received by the drivers 20, the high-voltage driver circuits are activated to forward a high-voltage drive signal to each transducer in the transducer array 18 to activate the transducer to transmit an ultrasonic signal into the image target 11.
  • Ultrasonic echoes reflected by the [0063] image target 11 are detected by the ultrasonic transducers in the array 18. Each transducer converts the received ultrasonic signal into a representative electrical signal which is forwarded to preamplification circuits 24 and time-varying gain control (TGC) circuitry 25. The preamp circuitry 24 sets the level of the electrical signals from the transducer array 18 at a level suitable for subsequent processing, and the TGC circuitry 25 is used to compensate for attenuation of the sound pulse as it penetrates through human tissue and also drives the beam forming circuits 26 (described below) to produce a line image. The conditioned electrical signals are forwarded to the beam forming circuitry 26 which introduces appropriate differential delay into each of the received signals to dynamically focus the signals such that an accurate image can be created. Further details of the beam forming circuitry 26 and the delay circuits used to introduce differential delay into received signals and the pulses generated by the pulse synchronizer 22 are described in the incorporated International Application PCT/US96/11166.
  • In one preferred embodiment, the dynamically focused and summed signal is forwarded to an A/[0064] D converter 27 which digitizes the summed signal. Digital signal data is then forwarded from the A/D 27 over the cable 16 to a color doppler processing circuit 36. It should be noted that the A/D converter 27 is not used in an alternative embodiment in which the analog summed signal is sent directly over the system cable 16. The digital signal is also demodulated in a demodulation circuit 28 and forwarded to a scan conversion circuit 37 in the data processor and display unit 14.
  • As also shown, a [0065] scan head memory 29 stores data from a controller 21 and the data processing and display unit 14. The scan head memory 29 provides stored data to the pulse synchronizer 22 , the TGC 25 and the beam former 26.
  • The [0066] scan conversion circuitry 37 converts the digitized signal data from the beam forming circuitry 26 from polar coordinates (r,θ) to rectangular coordinates (x,y). After the conversion, the rectangular coordinate data can be forwarded to an optional post signal processing stage 30 where it is formatted for display on the display 32 or for compression in a video compression circuit 34. The post processing 30 can also be performed using the scan conversion software described hereinafter.
  • Digital signal data from the A/D converter [0067] 27 is received by a pulsed or continuous Doppler processor 36 in the data processor unit 14. The pulsed or continuous Doppler processor 36 generates data used to image moving target tissue 11 such as flowing blood. In a preferred embodiment, with pulsed Doppler processing, a color flow map is generated. The pulsed Doppler processor 36 forwards its processed data to the scan conversion circuitry 28 where the polar coordinates of the data are translated to rectangular coordinates suitable for display or video compression.
  • A control circuit, preferably in the form of a [0068] microprocessor 38 inside of a personal computer (e.g., desktop, laptop, palmtop), controls the high-level operation of the ultrasound imaging system 10. The microprocessor 38 or a DSP initializes delay and scan conversion memory. The control circuit 38 controls the differential delays introduced in both the pulse synchronizer 22 and the beam forming circuitry 26 via the scan head memory 27.
  • The [0069] microprocessor 38 also controls a memory 40 which stores data used by the scan conversion circuitry 28. It will be understood that the memory 40 can be a single memory or can be multiple memory circuits. The microprocessor 38 also interfaces with the post signal processing circuitry 30 and the video compression circuitry 34 to control their individual functions. The video compression circuitry 34 compresses data to permit transmission of the image data to remote stations for display and analysis via a transmission channel. The transmission channel can be a modem or wireless cellular communication channel or other known communication method.
  • The portable [0070] ultrasound imaging system 10 of the invention can preferably be powered by a battery 44. The raw battery voltage out of the battery 44 drives a regulated power supply 46 which provides regulated power to all of the subsystems in the imaging system 10 including those subsystems located in the scan head 12. Thus, power to the scan head can be provided from the data processing and display unit 14 over the cable 16.
  • FIG. 4B is a schematic functional block diagram of an alternative preferred embodiment of the ultrasound imaging system of the invention. In a modified [0071] scan head 12′, demodulation circuitry is replaced by software executed by the microprocessor 38 in a modified data processing and display unit 14′. In particular, the digital data stream from the A/D converter 27 is buffered by a FIFO memory 37. The microprocessor executes software instructions to demodulate, perform scan conversion, color doppler processing, post signal processing and video compression. Thus many hardware functions of FIG. 4A are replaced by software stored in memory 40 in FIG. 4B, reducing hardware size and weight requirements for the system 10′.
  • Additional preferred embodiments for beam forming circuitry of ultrasound systems are depicted in FIGS. 5A, 5B, and [0072] 5C. Each of these implementations requires that sampled-analog data be down-converted, or mixed, to a baseband frequency from an intermediate frequency (IF). The down-conversion or mixing is accomplished by first multiplying the sampled data by a complex value (represented by the complex-valued exponential input to the multiplier stage), and then filtering the data to reject images that have been mixed to nearby frequencies. The outputs of this processing are available at a minimum output sample rate and are available for subsequent display or Doppler processing.
  • In FIG. 5A, a set of [0073] sampling circuits 56 is used to capture data 54 represented by packets of charge in a CCD-based processing circuit fabricated on an integrated circuit 50. Data are placed in one or more delay lines and output, at appropriate times using memory and control circuitry 62 and programmable delay circuits 58, to an optional interpolation filter 60. The interpolation filter can be used to provide refined estimates of the round-trip time of a sound wave and thereby provide better focus of the returned signals from an array of sensors. In FIG. 5A, two processing channels 52, of an array of processors, are depicted. The outputs from the interpolation filters are combined, at an analog summing junction 66, to provide a datum of beamformed output from the array.
  • Data obtained using an ultrasound transducer resembles the output of a modest-bandwidth signal modulated by the center frequency of the transducer. The center frequency, or characteristic frequency, of the transducer is equivalent to the IF. In a sampled-analog system (e.g., using CCDs), [0074] Ω=2πf_IF/f_s, where f_IF is the intermediate frequency and f_s is the sampling frequency. The value n corresponds to the sample-sequence number (i.e., n=0,1,2,3,4, . . . ). The outputs of the multiplier 68 are termed in-phase (I) or quadrature (Q) samples. In general, both I and Q values will be non zero. When the IF is chosen to equal f_s/4, however, the multiplier output will only produce either I or Q values in a repeating sequence, I, Q, −I, −Q, I, Q, −I . . . In fact, the input data are only scaled by 1 and −1. Thus, if the input data, a, are sequentially sampled at times a[0], a[1], a[2], a[3], a[4], . . . , a[n], the output data are a[0], j*a[1], −a[2], −j*a[3], a[4], . . .
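  • The fs/4 mixing pattern described above can be verified in a few lines (Python/NumPy); the stand-in samples are arbitrary, and the multiplier sequence j^n corresponds to mixing by a complex exponential at fs/4.

      import numpy as np

      n = np.arange(8)
      a = np.array([3.0, 1.0, 4.0, 1.0, 5.0, 9.0, 2.0, 6.0])   # stand-in sampled data a[n]

      mixed = a * (1j) ** n      # fs/4 mixing: a[0], j*a[1], -a[2], -j*a[3], a[4], ...
      print(mixed)               # scalings are only +/-1; I and Q alternate, one of them always zero

      i, q = mixed.real, mixed.imag
      print(np.sqrt(i ** 2 + q ** 2))   # envelope detection reduces to |a[n]| in this case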
  • The I and Q outputs [0075] 74, 76 are each low-pass filtered 70, 72 to reject signal images that are mixed into the baseband. The coefficients of the low-pass filters can be designed using a least-mean square (LMS or L2-norm) or Chebyshev (L-infinity norm) criteria. In practice, it is desirable to reduce the number of coefficients necessary to obtain a desired filter characteristic as much as possible.
  • An example of a CCD implementation of a low-pass filter is illustrated in FIG. 5D. The [0076] device 90 consists of a 13-stage tapped delay line with five fixed-weight multipliers 94 to implement the filter coefficients. As can be seen in the illustration of FIG. 6, the ripple in the passband is under 0.5 dB and the stopband attenuation is less than −30 dB of full scale.
  • The output of the low-pass filters are then decimated [0077] 78 by at least a factor of 2. Decimation greater than 2 may be warranted if the bandwidth of the ultrasound signal is bandlimited to significantly less than half the sampling frequency. For most ultrasound signals, a decimation factor greater than 2 is often used because the signal bandwidth is relatively narrow relative to the sampling frequency.
  • The order of the decimation and the low-pass filters may be interchanged to reduce the clocking frequency of the low-pass filters. By using a filter bank, the coefficients for the I and Q low-pass filters can be chosen such that each filter only accepts every other datum at its input. This “alternating clock” scheme permits the layout constraints to be relaxed when a decimation rate of 2 is chosen. These constraints can be further relaxed if the decimation factor is greater than 2 (i.e., when the signal bandwidth <<f_s/2). [0078]
  • The down-converted output data are passed on for further processing that may include signal-envelope detection or Doppler processing. For display, the signal envelope (also referred to as the signal magnitude) is computed as the square root of the sum of the squares of the I and Q outputs. For the case when [0079] IF=f_s/4, that is either I=0 or Q=0, envelope detection becomes trivial. The I and Q data are often the inputs to Doppler processing which also uses the signal envelope to extract information in the positive- and/or negative-frequency sidebands of the signal. In FIG. 5A, only one down-conversion stage is required following the ultrasound beamforming.
  • In FIG. 5B, a down-conversion stage has been placed in each [0080] processing channel 52 following the sampling circuits 56. Here the production of I and Q data 86, 88 is performed exactly as before, however, much sooner in the system. The primary advantage of this approach is that the data rate in each processing channel can be reduced to a minimum, based on the ultrasound signal bandwidth and hence the selection of the low-pass filter and decimation factor. In this implementation, all processing channels 52 will use the same complex-value multipliers and identical coefficients and decimation factors in the filter stage. As in the preceding implementation, complex-valued data are delayed and interpolated to provide beamformed output.
  • The ultrasound front end depicted in FIG. 5C is nearly identical to that in FIG. 5B. The difference is that the [0081] interpolation stage 85, 87 has been removed and replaced by choosing unique values in the complex-valued multipliers to provide a more-precise estimate of the processing-channel delay. This approach has the disadvantage that the output of the multiplier will always exhibit I and Q values that are non zero. This is a consequence of the varying sampling rate around the unit circle, in a complex-plane diagram, of the multiplier input. Thus, this approach can provide a more precise estimate of the sample delay in each channel, but at the expense of producing fully complex-valued data at the output of each processing channel. This modification may require more post-processing for envelope and Doppler detection than that presented in the previous implementations.
  • A preferred embodiment of a system used to interface between the output of the beamforming or filtering circuit and the computer is to provide a plug-in board or card (PCMCIA) for the computer. [0082]
  • The [0083] board 700 of FIG. 5E illustrates an embodiment in which 16 bits of digital beamformed data are received over the cable from the scanhead by differential receivers 702. A clock signal is also received at registers 704 along with converted differential data. The first gate array 708 converts the 16 bits to 32 bits at half the data rate. The 32-bit data is clocked into the FIFO 712, which outputs add-on data 716. The second gate array 710 has access to all control signals and outputs 714 to the PCI bus controller. This particular example utilizes 16 bits of data; however, the design can also be adapted for 32 bits or more.
  • Alternatively, a card suitable for insertion in a slot or port of a personal computer, laptop or palmtop computer can also be used. In this embodiment the differential receivers input to registers, which deliver data to the FIFO and then to a bus controller that is located on the card. The output from the controller is connected directly to the PCI bus of the computer. An alternative to the use of differential drivers and receivers to interconnect the scan head to the interface board or card is to utilize the IEEE 1394 standard cable, also known as “FireWire.” [0084]
  • An example of a preferred embodiment of an [0085] integrated beamforming circuit 740 is illustrated in FIG. 5F. The circuit 740 includes a timing circuit 742, and 5 delay circuits 760 attached to each side of summing circuit 754. Each circuit 760 includes a sampling circuit 746, a CCD delay line 752, a control and memory circuit 750, a decoder 748, and a clocking driver circuit 744. The circuitry is surrounded by contact pads 756 to provide access to the chip circuitry. The integrated circuit is preferably less than 20 square millimeters in area and can be mounted on a single board in the scan head as described in the various embodiments set forth in the above referenced incorporated application. A sixteen, thirty two, or sixty four delay line integrated circuit can also be implemented utilizing a similar structure.
  • FIG. 7A is a schematic diagram of input points overlayed on a display. As illustrated, input points Ip [0086] received from the ultrasound beam 180 do not exactly align with the rectangularly arranged pixel points P of a conventional display 32. Because the display 32 can only display pixelized data, the input points Ip must be converted to the rectangular format.
  • FIG. 7B is a schematic diagram of a display of FIG. 6 having input data converted to pixels. As illustrated, each image point Ip [0087] is assigned to a respective pixel point P on the display 32 to form an image.
  • One purpose of scan conversion is to perform the coordinate space transformation required for use with scan heads that are not flat linear, such as phased array, trapezoidal or curved linear heads. To do this, data must be read in one order and output data must be written in another order. Many existing systems must generate the transformation sequences on the fly, which reduces the flexibility and makes trapezoidal scan patterns more difficult. [0088]
  • Because scan conversion is reordering the data, it can also be used to rotate, pan and zoom the data. Rotation is useful for viewing the image with the scan head depicted at the top, left, right, or bottom of the image, or an arbitrary angle. Zooming and panning are commonly used to allow various parts of the image to be examined more closely. [0089]
  • In addition to zooming into one area of the object, it is useful to be able to see multiple areas simultaneously in different regions of the screen. Often the entire image is shown on the screen but certain regions are replaced with zoomed-in views. This feature is usually referred to as “window-in-a-window.” Current high-end systems provide this capability for one window, but it is preferred that an imaging system allow any number of zoomed regions, each having an arbitrary size and shape. [0090]
  • The use of irregular scan patterns can ease system design and allow greater scan head utilization. In particular, this allows reduction or hiding of dead time associated with imaging deep zones. In the case of deep zone imaging, the beam is transmitted but received at some later time after the wave has had time to travel to the maximum depth and return. More efficient use of the system, and thus a higher frame rate or greater lateral sampling, can be obtained if other zones are illuminated and reconstructed during this dead time. This can cause the scan pattern to become irregular (although fixed and explicitly computed). The flexible scan conversion described below corrects for this automatically. [0091]
  • FIG. 8 is a schematic diagram of a preferred embodiment of a general purpose image remapping architecture. In accordance with a preferred embodiment of the invention, data is preferably brought directly into the PC after beamforming and the remainder of the manipulation is performed in software. As such, additional hardware is minimized so the personal computer can be a small portable platform, such as a laptop or palmtop computer. [0092]
  • Preferably, there is an [0093] input array 142, a remap array 144 and an output array 146. The remap array 144 is an array of indices or pointers, which is the size of the output image, used to determine where to get each pixel from the input array 142. The numbers in each position in the remap array 144 indicate where in the input data to take each pixel which will go into the output array 146 in the same position. Thus, the remap array 144 and output array 146 can be thought of as having the same geometry while the input array 142 and output array 146 have the same type of data, i.e., actual image data.
  • The [0094] input array 142 has new data for each ultrasound frame, which means that the system processes the data and puts the data in the output array 146 on every frame. In accordance with the invention, there is a new ultrasound frame at a rate of at least 20 frames per second and preferably approximately every 1/30 second. However, the remap array 144 is only updated when the head type or viewing parameters (i.e., zoom and pan) are updated. Consequently, the remap array 144 data can be generated relatively slowly (but still well under about one second or else it can become cumbersome) as long as the routine operation of computing a new output image from a new input data set is performed at the frame rate of approximately 30 frames per second. This allows a general purpose personal computer to perform the task of generating the data for the remap array 144 without compromising performance, but also without having to dedicate additional hardware to the task. In a computing system having a digital signal processor (DSP), the DSP can perform the computations of the remap array 144.
  • In a preferred embodiment of the invention, input memory for the [0095] input array 142 can be either two banks of Static Random Access Memory (SRAM) or one bank of Video Random Access Memory (VRAM), where the input is serial access and the output is random access. The VRAM bank, however, may be too slow and refresh too costly. The remap memory for the remap array 144 is preferably sequential access memory embodied in VRAM, or Dynamic Random Access Memory (DRAM), although random access SRAM will also work. The output memory for the output array 146 can be either a frame buffer or a First-In First-Out (FIFO) buffer. Basically, the scan conversion is done on demand, on the fly. Scan conversion is preferably performed by software in the PC. If scan conversion is done in hardware, however, the PC is merely storing data, thus reducing system complexity. Thus, an architecture in accordance with the invention is preferably just two random access input buffers, a sequential access remap buffer and small (if any) FIFO or bit of pipelining for the output buffer. This implies the output frame buffer is in PC memory.
  • In accordance with a preferred embodiment of the invention, a spatial dithering technique employing error diffusion is used in ultrasound scan conversion. Typical dithering is done in the pixel intensity domain. In accordance with the invention, however, dithering is used in ultrasound scan conversion to approximate pixels in the spatial domain and not in the pixel intensity domain. Spatial dithering is used to approximate values that fall between two input data points. This happens because only discrete radii are sampled but pixels on the display screen can fall between two radii and need to be filled. Spatial dithering is also used to interpolate between longitudinal sample points. [0096]
  • Recall that the [0097] remap array 144 stores the mapping of each output point to an input point. The input data points are typically in polar coordinates while the output points are in rectilinear coordinates. Although the remap array 144 merely contains indices into the input array 142, they can be considered to contain radius (r) and angle (θ) values. Ideally, these values have arbitrary precision and do not have to correspond to actual sampled points. Now consider that these arbitrary precision numbers must be converted into integer values. The integer radius values correspond to discrete samples that were taken and are limited by the radial sampling density of the system. The integer angle values correspond to discrete radial lines that were scanned and are thus limited by the number of scan angles. If spatial dithering is applied, these floating point values can be mapped into fixed integer values without having the artifacts that appear with discrete rounding without error diffusion.
  • FIGS. [0098] 9A-9B are a flow chart illustrating a remap array computation technique in accordance with the invention. At step 205, the scan heads are checked to see if there has been any change. If the scan heads have been changed, processing continues to step 210 where the new head type is configured. After step 210, or if there has been no change in the scan heads (step 205) processing continues to step 215. At step 215, the display window is checked to see if there is any zooming, panning or new window-in-window feature. If so, processing continues to step 220 where the user inputs the new viewing parameters. After step 220, or if there is no window change at step 215, processing continues to step 225 where the remap array is cleared to indicate a new relationship between the input and output arrays.
  • At [0099] step 230, the program chooses a window W to process. At step 235, all line error values LE and all sample error values SE are initialized to zero. At step 240, a point counter P is initialized to point to the top left pixel of the window W.
  • At [0100] step 245, the application computes a floating point line number LFP and sample offset SFP for each point in a view V. For a phased array, this would be a radius r and an angle θ. At step 250, any previously propagated error terms LE, SE (discussed below) are added to the floating point values LFP, SFP for the point P. At step 255, floating point terms are rounded to the nearest integer LR, SR, which correspond to actual sampled points. At step 260, the application computes rounding errors as:
  • LRE = LFP − LR;
  • SRE = SFP − SR.
  • At [0101] step 265, the errors are propagated to the pixel points to the right, below left, below, and below right relative to the current point P.
    PROPAGATE ERRORS
    LE (right) = LE (right) + LRE * 7/16
    LE (below left) = LE (below left) + LRE * 3/16
    LE (below) = LE (below) + LRE * 5/16
    LE (below right) = LE (below right) + LRE * 1/16
    SE (right) = SE (right) + SRE * 7/16
    SE (below left) = SE (below left) + SRE * 3/16
    SE (below) = SE (below) + SRE * 5/16
    SE (below right) = SE (below right) + SRE * 1/16
  • At [0102] step 270, the application computes a data index based on a scan data ordered index:
  • REMAP(P) = Index(LR, SR).
  • At [0103] step 275, a check is made to see if there are more points in the window. If there are more points to be processed, the pointer P is incremented to the next point at step 280. Processing then returns to step 245. Once all points in the window have been processed, processing continues to step 285.
  • At [0104] step 285, a check is made to see if there are more windows to be processed. If so, processing returns to step 230. Otherwise, processing is done.
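  • The steps of FIGS. 9A-9B can be summarized in software form. The sketch below is illustrative only: the coordinate and index functions are assumed placeholders for a particular head geometry, the handling of multiple windows is omitted, and clamping of the rounded values to the valid line/sample ranges is left out for brevity. It shows the error-diffusion weights of step 265 applied while filling the remap array.

      import numpy as np

      def compute_remap(height, width, coord_of_pixel, index_of):
          # coord_of_pixel(y, x) -> (line_fp, sample_fp): floating-point scan
          #   coordinates of an output pixel (e.g., angle and radius indices).
          # index_of(line_int, sample_int) -> index into the input array.
          # Both callables are hypothetical placeholders for a given head type.
          remap = np.zeros((height, width), dtype=np.int32)
          le = np.zeros((height + 1, width + 2))   # propagated line errors (padded)
          se = np.zeros((height + 1, width + 2))   # propagated sample errors (padded)
          for y in range(height):
              for x in range(width):
                  l_fp, s_fp = coord_of_pixel(y, x)
                  l_fp += le[y, x + 1]             # add previously diffused error
                  s_fp += se[y, x + 1]
                  l_r, s_r = int(round(l_fp)), int(round(s_fp))
                  l_err, s_err = l_fp - l_r, s_fp - s_r
                  # diffuse to right, below-left, below, below-right (7/16, 3/16, 5/16, 1/16)
                  for dy, dx, w in ((0, 1, 7/16), (1, -1, 3/16), (1, 0, 5/16), (1, 1, 1/16)):
                      le[y + dy, x + 1 + dx] += l_err * w
                      se[y + dy, x + 1 + dx] += s_err * w
                  remap[y, x] = index_of(l_r, s_r)
          return remap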
  • Because the dithering maps one source to each output pixel, the same remapping architecture can be used to make real-time scan conversion possible in software, even on portable computers. Thus, the complicated dithering operation is only performed during initialization or when viewing parameters are changed. However, the benefits of the dithering are present in all the images. [0105]
  • FIG. 10 is a flow chart of an output frame computation engine. At [0106] step 305, beamformed, demodulated input data is read into memory. At step 310, the output pixel index P is initialized. At step 315, the output array is set equal to the remapped input array according to the following:
  • OUTPUT(P)=INPUT(REMAP(P)).
  • At [0107] step 320, the output pixel index P is incremented. At step 325, a check is done on the pixel index P to see if the image has been formed. If not, processing returns to step 315. Once all the pixels in the image have been computed, processing continues to step 330 where the output image is optionally smoothed. Finally, at step 335, the output image is displayed.
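  • In software the per-frame loop of FIG. 10 reduces to an indexed copy. A minimal sketch (the array names are assumptions):

      import numpy as np

      def remap_frame(input_data, remap):
          # OUTPUT(P) = INPUT(REMAP(P)) for every output pixel P:
          # one gather per pixel, i.e., two loads and a store.
          return input_data.ravel()[remap]

      # Usage: input_data holds one beamformed, demodulated frame in scan
      # order; remap is the precomputed index array with the output geometry.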
  • Although dithering does remove the Mach-banding and moire pattern artifacts which occur with simple rounding, dithering can introduce high-frequency noise. It is this high-frequency noise whose average value allows for the smooth transition effects. To the untrained eye, these artifacts are far less objectionable than those obtained with the simple rounding or nearest-point case, but they may be objectionable to ultrasound technicians. [0108]
  • These artifacts can be greatly reduced or potentially eliminated by employing a low-pass spatial filter to smooth the image after the remapping process. The filter can be a box filter, or non-symmetrical filters can be matched to a desired input resolution characteristic. Filters can be applied in the rectilinear domain that match the orientation or angle of point coordinates at the particular location. [0109]
  • Basically, it is desirable to have a matched filter whose extent is similar to or proportional to the distances between the points being dithered. Thus, a high magnification is preferably accompanied by a large filter with much smoothing, whereas in places where the spacing of the sampled radius r or angle θ is small (on the order of one pixel), no filtering may be required. [0110]
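  • A minimal sketch of such post-remap smoothing, assuming a simple box filter whose size tracks the local spacing of the dithered source points (the scipy box filter is used only for brevity; it is not the specific filter of the embodiment):

      import numpy as np
      from scipy.ndimage import uniform_filter

      def smooth(image, point_spacing_px):
          # Box-filter smoothing matched to the dither spacing: a large
          # spacing (high zoom) gets a large kernel; spacing near one
          # pixel needs no filtering at all.
          size = int(round(point_spacing_px))
          return image if size <= 1 else uniform_filter(image, size=size)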
  • Because the remapping operation is basically two loads and a store, it can be performed using a standard personal computer. The remapping algorithm, when encoded in assembly language, has been shown to work on a 166 MHz Pentium-based PC to obtain very-near real-time operation. In addition, the demodulation has been performed on the PC when written in assembly language while still achieving near real-time operation. Text and graphics labels are preferably effected by storing fixed values or colors in the beginning of the input buffer and then mapping to those places where those colors are to be used. In effect, shapes or text are drawn in the remap array and are then automatically overlayed on all of the images at no computational cost. [0111]
  • FIGS. [0112] 11A-11B are schematic pictorial views of display formats which can be presented on the display 32 of the invention. Rather than displaying a single window of data as is done in prior ultrasound imaging systems, the system of the present invention has multiple window display formats which can be selected by the user. FIG. 11A shows a selectable multi-window display in which three information windows are presented simultaneously on the display. Window A shows the standard B-scan image, while window B shows an M-scan image or a Doppler two-dimensional color flow map. Window C is a user information window which communicates command selections to the user and facilitates the user's manual selections. FIG. 11B is a single-window optional display in which the entire display is used to present only a B-scan image. Optionally, the display can show both the B-mode and color Doppler scans simultaneously by overlaying the two displays or by showing them side-by-side using a split screen feature.
  • FIG. 12 is a functional block diagram of a preferred graphical user interface. A [0113] virtual control display 400 includes an ultrasound image control display 410, a probe model properties display 420, and a probe specific properties display 500. The virtual control display 400 is preferably coded as dialog boxes in a Windows environment.
  • FIG. 13 illustrates a dialog box for the ultrasound image control [0114] 410. Through the ultrasound image control display 410, the user can select a probe head type 412, a zone display 414, a demodulation filter 416, and an algorithm option 418. The user also can initiate the ultrasound scan through this dialog box.
  • The probe model properties display [0115] 420 includes model type 425, safety information 430, image Integrated Pulse Amplitude (IPA) data 435, doppler IPA data 440, color IPA data 445, probe geometry 450, image zones data 455, doppler zones data 460, color zones data 465, image apodization 470, doppler apodization 475, and color apodization 480. These are preferably encoded as dialog boxes. Through the model-properties dialog box 425, a user can enter general settings for the probe model.
  • FIG. 14A illustrates a dialog box for entering and viewing probe model properties. Entered parameters are downloaded to the ultrasound probe. [0116]
  • FIG. 14B illustrates a dialog box for entering and [0117] viewing safety information 430. As illustrated, a user can enter general settings 432 and beam width table data 434 per governing standards.
  • FIG. 14C illustrates a dialog box for entering and viewing [0118] image IPA data 435. The dialog box displays beamformed output values, listed in volts as a function of image display zones for various drive voltages. Similar dialog boxes are used to enter the doppler and color IPA data 440, 445.
  • FIG. 14D illustrates a dialog box for effecting the [0119] image apodization function 470. As illustrated, the operator can enter and view general settings 472 and vector information 474. The user can select active elements for array windowing (or apodization).
  • The probe [0120] specific property display 500 includes dialog boxes for entering probe specifics 510, image Field Of-View (FOV) data 520, doppler FOV data 530, and color FOV data 540. Through the probe specifics dialog box 510, the user can enter general settings 512, imaging static information 514, doppler static information 516, and FOV settings 518.
  • FIG. 15A illustrates a dialog box for entering and viewing probe-specific information. Any number of probes can be supported. [0121]
  • FIGS. 15B-[0122] 15C illustrate dialog boxes for entering image FOV data 520. As illustrated, a user can enter general settings 522, breakpoint PGC data 524, zone boundaries 526, and zone duration 528 data. Dialog boxes for the doppler and color FOV data displays 530, 540 are similar and allow the entry of general settings 532, 542, breakpoint TGC data 534, 544, and PRF data 536, 546.
  • FIGS. [0123] 15D-15J illustrate additional windows and control panels for controlling an ultrasound imaging system in accordance with the invention. FIG. 15D shows a viewing window for the region of interest and a control panel situated side by side with the scan image. FIG. 15E shows controls for the doppler field of view and other selectable settings. FIG. 15F shows the color field of view controls. FIG. 15G shows properties of the probe. FIG. 15H shows the color IPA data for a probe. FIG. 15I shows the probe geometry settings for a linear array. FIG. 15J shows settings for doppler apodization.
  • FIG. 16 illustrates the zoom feature of a preferred embodiment of the imaging system in accordance with the invention. In this particular illustration detailed features of a phantom, or internal [0124] anatomical features 600 of a patient that are shown on screen 32, can be selected and enlarged within or over a display window. In this particular example, a region 602 is selected by the user and is enlarged at window 604. A plurality of such regions can be simultaneously enlarged and shown on screen 32 in separate or overlying windows. If two scan heads are in use, different views can be shown at the same time, or previously recorded images can be recalled from memory and displayed beside an image presented in real time.
  • The architecture of the integrated front-end probe approach was designed to provide small size, low power consumption and maximal flexibility in scanning, including: 1) multi-zone focus on transmission; 2) ability to drive a variety of probes, such as linear/curved linear, linear/trapezoidal, and sector scan; 3) ability to provide M-mode, B-mode, Color Flow Map and Doppler Sonogram displays; 4) multiple, selectable pulse shapes and frequencies; and 5) different firing sequences. Different embodiments for the integrated front-[0125] end system 700 are shown in FIGS. 17A, 17B and 17C. Modules unique to this invention are the blocks corresponding to: beamforming chip 702, transmit/receive chip 704, preamplifier/TGC chip 706.
  • The block labeled “front-end probe” (front-end controller) directly controls the routine operation of the ultrasound scan head by generating clock and control signals provided to [0126] modules 702, 704, 706 and to the memory unit 708. These signals are used to assure continuous data output and to indicate the module for which the data appearing at the memory-unit output are intended. Higher level control of the scan head 710, as well as initialization, data processing and display functions, are provided by a general purpose host computer 720, such as a desktop PC, laptop or palmtop. Thus, the front-end controller also interfaces with the host computer, e.g. via PCI bus or FireWire 714, to allow the host to write control data into the scanhead memory unit and receive data back. This is performed at initialization and whenever a change in parameters (such as the number and/or position of zones or the type of scan head) is required, as when the user selects a different scanning pattern. The front-end controller also provides buffering and flow-control functions, as data from the beamformer must be sent to the host via a bandwidth-constrained link, to prevent data loss.
  • The system described permits two different implementations of the Color Flow Map (CFM) and Doppler Sonogram (DS) functions. FIG. 17A shows a hardware-based [0127] 722 implementation, in which a dedicated Doppler-processing chip is mounted on a back-end card 724 and used as a co-processor to the host computer 720 to accomplish the CFM and DS computations. FIG. 17B shows a software implementation in which the CFM and DS computations are performed by the host computer.
  • FIG. 17C shows yet another system integration, in which the transducer array and the front-end processing units are not integrated into a single housing but are connected by coaxial cables. The front-end units include the front-end controller, the memory and the three modules [0128] 704 (transmit/receive chip), 706 (preamp/TGC chip) and 702 (the beamforming chip) as shown in the Figure.
  • “FireWire” refers to IEEE standard 1394, which provides high-speed data transmission over a serial link. This allows use of high-volume, low cost commercial parts for the interface. The standard supports an asynchronous data transfer mode that can be used to send commands and configuration data to the probe head memory. It can also be used to query the status of the head and obtain additional information, such as the activation of any buttons or other input devices on the head. Additionally, the asynchronous data transfer mode can be used to detect the type of probe head attached. An isochronous transfer mode can be used to transfer data back from the beamformer to the host. These data may come directly from the A/D or from the demodulator or some combination. If Doppler processing is placed in the probe head, the Doppler processed data can be sent via FireWire. Alternatively the data can be Doppler processed via software or hardware in the host. There also exists a wireless version of the FireWire standard, allowing communication via an optical link for untethered operation. This can be used to provide greater freedom when the probe head is attached to the host using wireless FireWire. [0129]
  • The preamp/TGC chip as implemented consists of [0130] 32 integrated parallel, low-noise, low-power amplifier/TGC units. Each unit has 60-dB programmable gain, a noise voltage less than 1.5 nV/√Hz, and dissipates less than 11 mW per receiver channel.
  • As shown in FIG. 18, the multi-channel transmit/receive chip consists of a global counter, a global memory and a bank of parallel dual-channel transmit/receive controllers. Within each [0131] controller 740, there are local memory 745, delay comparator, frequency counter & comparator, pulse counter & comparator, phase selector, transmit/receive select/demux switch (T/R switch), and level shifter units.
  • The [0132] global counter 742 broadcasts a master clock and bit values to each channel processor 740. The global memory 744 controls transmit frequency, pulse number, pulse sequence and transmit/receive select. The local delay comparator 746 provides delay selection for each channel. For example, with a 60 MHz clock and a 10-bit global counter, a delay of up to 17 μs can be provided for each channel. The local frequency counter 748 provides a programmable transmit frequency. A 4-bit counter with a comparator provides up to sixteen different frequency selections. For example, using a 60-MHz master clock, a 4-bit counter can be programmed to provide different transmit frequencies such as 60/2=30 MHz, 60/3=20 MHz, 60/4=15 MHz, 60/5=12 MHz, 60/6=10 MHz and so on. The local pulse counter 750 provides different pulse sequences. For example, a 6-bit counter with a comparator can provide programmable transmitted pulse lengths from one pulse up to 64 pulses. The locally programmable phase selector provides sub-clock delay resolution.
  • While typically the clock period of the transmit chip determines the delay resolution, a technique called programmable subclock delay resolution allows the delay resolution to be finer than the clock period. With programmable subclock delay resolution, the output of the frequency counter is gated with a phase of the clock that is programmable on a per-channel basis. In the simplest form, a two-phase clock is used and the output of the frequency counter is gated with either the asserted or the deasserted clock. Alternatively, multiple skewed clocks can be used; one per channel can be selected and used to gate the coarse timing signal from the frequency counter. For example, for a 60-MHz master clock, a two-to-one phase selector provides 8-ns delay resolution and a four-to-one phase selector provides 4-ns delay resolution. [0133]
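  • The selectable transmit frequencies and the sub-clock delay resolution follow directly from the master clock; a short illustrative computation (the function and parameter names are assumptions, not part of the chip design):

      def transmit_frequencies(master_clock_hz=60e6, counter_bits=4):
          # Frequencies selectable by an n-bit frequency counter: the master
          # clock divided by 2, 3, 4, ... (60/2 = 30 MHz, 60/3 = 20 MHz, ...).
          return [master_clock_hz / d for d in range(2, 2 ** counter_bits + 1)]

      def delay_resolution_ns(master_clock_hz=60e6, phases=2):
          # Sub-clock delay resolution of an m-to-one phase selector.
          return 1e9 / (master_clock_hz * phases)

      # delay_resolution_ns(60e6, 2) -> ~8.3 ns (the "8-ns" figure above);
      # delay_resolution_ns(60e6, 4) -> ~4.2 ns (the "4-ns" figure above).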
  • Also shown are the integrated transmit/receive [0134] select switch 754 (T/R switch) and the integrated high-voltage level shifter 750 for the transmit pulses. A single-chip transmit/receive chip capable of handling 64 channel drivers and 32-channel receivers can be used, each channel having a controller as shown in FIG. 18.
  • In another implementation, shown in FIG. 19, the T/R select/mux switch and the high-voltage level shifter are separated from the [0135] other components 760 on a separate chip 762 to allow use of different high-voltage semiconductor technologies, such as high-breakdown silicon CMOS/JFET or GaAs technology for production of these components.
  • The basic method for pulsed-Doppler ultrasound imaging is illustrated in FIG. 20. The waveform consists of a burst of [0136] N pulses 770. After each pulse as many range (depth) samples as needed are collected. The time evolution of the velocity distribution of material within the range gate is displayed as a sonogram 772, a two-dimensional display in which the horizontal axis represents time and the vertical axis velocity (as assessed by Doppler shift). Different regions can be interrogated by moving the range gate and varying its size. A Doppler sonogram can be generated using single-range-gate Doppler processing, as shown in FIG. 20. The operation of this method is as follows. A sequence of N ultrasonic pulses is transmitted at a pulse repetition frequency fprf along a given viewing angle. The return echoes are range gated and only returns 774 from a single range bin are used, meaning that only the returned signals corresponding to a region at a selected distance (e.g. from depth d to d+δd) from the transducer array along the selected viewing angle are processed to extract Doppler information. The velocity profiles of scatterers in the selected region can be obtained by computing the Doppler shifts of the echoes received from the scatterers. That is, Fourier transformation 776 of the received time-domain signal provides frequency information, including the desired Doppler, fd. The velocity distribution of the scatterers in the region of interest can be obtained from the relationship:
  • fd = (2v/c) fc
  • where c is the speed of sound in the transmitting medium and fc [0137] is the center frequency of the transducer. As an example, if N=16 and fprf=1 kHz, the above equation can be used to generate a sonogram 772 displaying 16 ms of Doppler data. If the procedure is repeated every N/fprf seconds, a continuous Doppler sonogram plot can be produced.
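  • Rearranging the relationship above gives v = fd·c/(2 fc); a minimal sketch (the 1540 m/s speed of sound and the example numbers are assumed values for illustration only):

      def doppler_velocity(f_d, f_c, c=1540.0):
          # fd = (2 v / c) fc  =>  v = fd * c / (2 * fc)
          # c = 1540 m/s is a commonly assumed speed of sound in soft tissue.
          return f_d * c / (2.0 * f_c)

      # Example: a 1 kHz Doppler shift at a 3.5 MHz center frequency
      # corresponds to roughly 0.22 m/s of motion toward the transducer:
      # doppler_velocity(1e3, 3.5e6) -> 0.22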
  • Another embodiment involves a pulse-Doppler process for color flow map applications. It is clinically desirable to be able to display flow rates and patterns over a large region in real time. One method for approaching this task using ultrasound is called color flow mapping (CFM). Color flow mapping techniques are an extension of the single-gated system described above. In CFM, velocities are estimated not only along a single direction or line segment, but over a number of directions (multiple scan lines) spanning a region of interest. The velocity information is typically color-coded (e.g. red indicates flow toward the transducer, blue away) and superimposed over a B-mode image that displays the underlying anatomy. [0138]
  • A color-[0139] flow map 780 based on pulsed-Doppler processing is shown in FIG. 21. The basic single-range-bin system of FIG. 20 can be extended to measure a number of range gates by sampling at different depths and retaining the samples in storage for additional processing. Note that this does not increase the acquisition time, as data are collected from the same RF line. Sweeping the beam over an area then makes it possible to assemble an image of the velocities in a 2D region of interest. In operation, the data from J range bins 782 along a single direction are processed in parallel. After N pulse returns are processed, the outputs represent a J×N range-vs-Doppler distribution, which in turn can be used to generate a J×N velocity distribution profile. The mean velocity at each depth dk, k=1, 2, . . . , J, is used to generate a single point or cell on the color-flow map; in each cell, the standard deviation is used to assess turbulence. If the procedure is repeated every N/fprf seconds for every J range bins (e.g. spaced J/2 range bins apart) and for every scan line in the region of interest, a 2D color-flow map plot can be produced.
  • It is important to note that instead of an FFT-based computation, a cross correlation technique, as described in the publication of Jorgen A. Jensen, “Estimation of Blood Velocities Using Ultrasound,” University Press 1996, the contents of which are incorporated herein by reference, can also be used to produce a similar color flow map. [0140]
  • The range gate size and position can be determined by the user. This choice determines both the emitted pulse length and pulse repetition frequency. The size of the range gate is determined by the length of the pulse. The pulse duration is [0141]
  • Tp = 2lg/c = M/fo
  • where lg [0142] is the gate length and M is the number of sine periods. The depth of the gate determines how quickly pulse echo lines can be acquired. The maximum rate is
  • fprf = c/2do
  • where do [0143] is the distance to the gate.
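  • As a short illustrative computation of the two relationships above (the 1540 m/s speed of sound, the 1 mm gate and the 5 cm depth are assumed example values, not system specifications):

      def pulse_duration_s(gate_length_m, c=1540.0):
          # Tp = 2 * lg / c: round-trip time across the range gate.
          return 2.0 * gate_length_m / c

      def max_prf_hz(gate_depth_m, c=1540.0):
          # fprf = c / (2 * do): maximum pulse repetition frequency
          # for a gate at depth do.
          return c / (2.0 * gate_depth_m)

      # pulse_duration_s(1e-3) -> ~1.3 microseconds for a 1 mm gate;
      # max_prf_hz(0.05)       -> ~15.4 kHz for a gate at 5 cm depth.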
  • The generic waveform for pulse-Doppler ultrasound imaging is shown in FIG. 22, where the waveform consists of a burst of [0144] N pulses 800. As many range (depth) samples as needed are collected following each pulse in the burst. FIG. 22 also shows a block diagram 810 of a conventional signal processor for this imaging technique, where the returned echoes received by each transducer are sampled and coherently summed prior to in-phase and quadrature demodulation. The down-converted/basebanded returns are converted to a digital representation, and then stored in a buffer memory until all the pulse returns comprising a coherent interval are received. The N pulse returns collected for each depth are then read from memory, a weighting sequence, v(n), is applied to control Doppler sidelobes, and an N-point FFT is computed. During the time the depth samples from one coherent interval are being processed through the Doppler filter, returns from the next coherent interval are arriving and are stored in a second input buffer. The FFT 818 output is passed on to a display unit, either directly or after time averaging of the Doppler samples, for subsequent display.
  • The CDP device described here performs all of the functions indicated in the dotted box of FIG. 22, except for A/D conversion, which is not necessary because the CDP device provides the analog sampled data function. This CDP Pulsed-Doppler Processor (PDP) device has the capability to compute a matrix-matrix product, and therefore has a much broader range of capabilities than needed to implement the functions shown within the dotted lines. [0145]
  • The PDP device computes the product of two real-valued matrices by summing the outer products formed by pairing columns of the first matrix with corresponding rows of the second matrix. [0146]
  • In order to describe the application of the PDP to the Doppler filtering problem, we first cast the Doppler filtering equation into a sum of real-valued matrix operations. The Doppler filtering is accomplished by computing a Discrete Fourier Transform (DFT) of the weighted pulse returns for each depth of interest. If we denote the depth-Doppler samples g(k,j), where k is the Doppler index, 0≦k≦N−1, and j is the depth index, then [0147]
  • g(k,j) = Σ_{n=0}^{N−1} v(n) f(n,j) exp(−j2πkn/N)
  • The weighting function can be combined with the DFT kernel to obtain a matrix of Doppler filter transform coefficients with elements given by [0148]
  • W(k,n) = Wk,n = v(n) exp(−j2πkn/N)
  • The real and imaginary components of the Doppler filtered signal can now be written as [0149]
  • gr,kj = Σ_{n=0}^{N−1} (Wr,kn fr,nj − Wi,kn fi,nj)
  • gi,kj = Σ_{n=0}^{N−1} (Wi,kn fr,nj + Wr,kn fi,nj)
  • In the above equations, the double-indexed variables may all be viewed as matrix elements. Therefore, in matrix representation, the Doppler filtering can be expressed as a matrix product operation. It can be seen that the PDP device can be used to perform each of the four matrix multiplications, thereby implementing the Doppler filtering operation. [0150]
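  • A minimal software sketch of the same decomposition into four real-valued matrix products (the matrix shapes and names are assumptions for illustration, not the PDP hardware data path):

      import numpy as np

      def doppler_filter(f_r, f_i, v):
          # f_r, f_i: N x J real matrices of in-phase/quadrature pulse returns
          # (N pulses, J depths); v: length-N weighting sequence.
          v = np.asarray(v, dtype=float)
          N = f_r.shape[0]
          k = np.arange(N).reshape(-1, 1)
          n = np.arange(N).reshape(1, -1)
          W = v[None, :] * np.exp(-2j * np.pi * k * n / N)   # W(k,n) = v(n) exp(-j2πkn/N)
          W_r, W_i = W.real, W.imag
          g_r = W_r @ f_r - W_i @ f_i     # two of the four real matrix products
          g_i = W_i @ f_r + W_r @ f_i     # the other two
          return g_r, g_i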
  • A block diagram of the PDP device described in this invention is shown in FIG. 22. The device includes a J-stage CCD tapped delay line, J CCD multiplying D/A converters (MDACs), J×K accumulators, a J×K Doppler sample buffer, and a parallel-in-serial-out (PISO) output shift register. The MDACs share a common 8-bit digital input on which elements from the coefficient matrix are supplied. The tapped delay line performs the function of a sample-and-hold, converting the continuous-time analog input signal to a sampled analog signal. [0151]
  • A two-[0152] PDP implementation 840 for color flow mapping in an ultrasound imaging system is shown in FIG. 23. In this device, during one pulse return interval, the top PDP component computes all the terms of the form Wrfr and Wifr as shown above, while the bottom component computes the terms of the form −Wifi and Wrfi. The outputs of each component are then summed to alternately obtain gr and gi.
  • Doppler and color flow map processing involves a significant amount of computation. This processing may be accomplished in software using a general-purpose microprocessor. The presence of instructions optimized for matrix-matrix operations, such as the Intel MMX feature set, can substantially improve performance. A software flow chart for color-flow map computation based on the FFT computation algorithm is shown in FIG. 24. After [0153] initialization 900, the downconverted data is obtained 902 and the pointer P is set to the beginning of the scan line 904; the data is averaged and stored 906, a weighting function is applied 908, the FFT is computed 910, and the magnitude z(k) is computed for each frequency 912, followed by the computation of first and second moments 914 and display thereof in color 916. The pointer is incremented 918 and each scan line is processed as needed.
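  • A minimal per-scan-line sketch of the FFT-based computation (the mean-subtraction clutter removal, the Hanning weighting and the bin-to-velocity scale factor are assumptions for illustration, not the specific choices of FIG. 24):

      import numpy as np

      def cfm_moments(iq, v_scale):
          # iq: J x N complex array (J range bins, N pulse returns per scan line).
          # v_scale: converts a Doppler-bin index to velocity (set by fprf, fc, c).
          iq = iq - iq.mean(axis=1, keepdims=True)        # remove stationary clutter
          w = np.hanning(iq.shape[1])                      # weighting to control sidelobes
          spec = np.fft.fftshift(np.fft.fft(iq * w, axis=1), axes=1)
          z = np.abs(spec) ** 2                            # magnitude per frequency bin
          k = np.arange(iq.shape[1]) - iq.shape[1] // 2    # signed Doppler bin index
          p = z / z.sum(axis=1, keepdims=True)
          mean_k = (p * k).sum(axis=1, keepdims=True)      # first moment
          mean_velocity = mean_k[:, 0] * v_scale
          turbulence = np.sqrt((p * (k - mean_k) ** 2).sum(axis=1)) * v_scale
          return mean_velocity, turbulence                 # one value per range bin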
  • A software flow chart for color-flow map computation based on the cross-correlation computation is shown in FIG. 25. [0154]
  • After [0155] initiation 940, the scan line data is obtained 942, followed by the range bin data 944. The cross correlation is computed 946 and averaged 948, and the velocity distribution 950 and the first and second moments 952 are obtained and displayed 954. The range bin is incremented 956 and the process repeated.
  • While this invention has been particularly shown and described with references to preferred embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims. [0156]

Claims (46)

What is claimed:
1. A method of processing image data with an ultrasound imaging device comprising:
providing a portable ultrasound imaging system including a transducer array within a handheld probe, an interface unit connected to the handheld probe with a first interface, the interface unit having a beamforming device and being connected to a data processing system with a second interface;
transmitting data from the handheld probe to the interface unit with the first interface;
performing a beamforming operation with the beamforming device in the interface unit; and
transmitting data from the interface unit to the data processing system with the second interface such that the data processing system receives a beamformed electronic representation of the region of interest.
2. The method of claim 1 wherein the data processing system further comprises a portable computer having a flat panel display.
3. The method of claim 1 further comprising generating a colored image of the object.
4. The method of claim 1 further comprising providing a beamforming circuit including a programmable delay device.
5. The method of claim 1 further comprising a hand-held probe having a circuit board on which circuit elements are mounted, the circuit elements including a charged coupled device integrated circuit connected to an analog to digital converter.
6. The method of claim 1 wherein the step of providing a data processing system further comprises providing a battery powered portable computer having a graphical user interface.
7. The method of claim 1 further comprising displaying the image in one of a plurality of windows on a display connected to the data processing system.
8. The method of claim 1 further comprising performing a scan conversion with a remap array.
9. The method of claim 1 wherein the first interface comprises an electrical cable.
10. The method of claim 1 wherein the second interface comprises an electrical cable.
11. The method of claim 1 wherein the second interface comprises a wireless connection.
12. A portable ultrasound system for imaging a region of interest comprising:
a handheld probe housing in which a transducer array is mounted;
an interface unit connected to the handheld probe housing with a first interface, the interface unit including a beamforming device; and
a data processing system connected to the interface unit with a second interface such that the data processing system receives a beamformed representation of the region of interest.
13. The system of claim 12 further comprising a flat panel display connected to the data processing system that displays an image of the region of interest.
14. The system of claim 12 wherein the probe housing further comprises a beamforming circuit board having a programmable delay device.
15. The system of claim 12 further comprising a circuit board within the probe housing, the circuit board having a beamforming integrated circuit mounted thereon.
16. The system of claim 12 further comprising a display and a battery in the data processing system such that the battery provides power to the probe housing.
17. The system of claim 12 further comprising a digital signal processor in the interface unit.
18. The system of claim 12 wherein the data processing system comprises a personal computer having a graphical user interface.
19. The system of claim 12 further comprising a data transmitter that forwards isochronous data from the interface unit to the data processing unit.
20. The system of claim 12 wherein the second interface provides asynchronous data transfer between the interface unit and the data processing system.
21. The system of claim 12 wherein asynchronous signals are transmitted from the data processing unit to the interface unit.
22. The system of claim 12 wherein the transducer array comprises a phased array device.
23. The system of claim 12 wherein the beamforming device comprises 32 channels or more.
24. The system of claim 12 wherein the interface unit further comprises a transmit/receiver circuit and a preamplifier circuit.
25. The system of claim 12 wherein the interface unit further comprises a memory and a system controller.
26. The system of claim 12 wherein the beamforming device comprises a CDP beamformer.
27. The system of claim 12 wherein the data processor includes a scan conversion system and a high standard high speed communications port.
28. The system of claim 12 wherein the second interface comprises a wireless connection.
29. The system of claim 12 wherein the data processing system is programmed to perform scan conversion with a remap array.
30. A portable ultrasound system for imaging a region of interest comprising:
an ultrasound probe system including a transducer array and a beamforming device; and
a data processing system connected to the ultrasound probe system with an isochronous transfer connector from the beamforming device to the data processing system to provide a high-speed transmission link such that the data processing system receives a beamformed representation of the region of interest.
31. The system of claim 30 further comprising a flat panel display connected to the data processing system that displays an image of the region of interest.
32. The system of claim 30 wherein the probe system further comprises a beamforming circuit having a programmable delay device.
33. The system of claim 30 further comprising an interface unit connected to a probe housing and a circuit board within the interface unit, the circuit board having a beamforming integrated circuit mounted thereon.
34. The system of claim 33 further comprising a display and a battery in the data processing system such that the battery provides power to the probe housing.
35. The system of claim 33 further comprising a digital signal processor in the interface unit.
36. The system of claim 30 wherein the data processing system comprises a personal computer having a graphical user interface.
37. The system of claim 30 further comprising a data transmitter that forwards isochronous data from the probe system to the data processing unit.
38. The system of claim 33 wherein the transmission link provides a connection between the interface unit and the data processing unit.
39. The system of claim 30 wherein asynchronous signals are transmitted from the data processing system to the probe.
40. The system of claim 33 wherein the interface unit comprises a wireless interface.
41. The system of claim 30 wherein the transducer array comprises a phased array device.
42. The system of claim 30 wherein the probe system comprises a handheld housing having a transducer array and an interface unit.
43. The system of claim 30 wherein the probe comprises a beamformer device.
44. The system of claim 42 wherein the interface unit comprises a beamforming device.
45. The system of claim 30 wherein the transmission link comprises a wireless interface.
46. The system of claim 30 further comprising a first cable connecting the probe to an interface unit and a second cable for the transmission link between the interface unit and the data processor.
US10/114,183 1995-06-29 2002-04-02 Ultrasound scan conversion with spatial dithering Abandoned US20030028113A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US10/114,183 US20030028113A1 (en) 1995-06-29 2002-04-02 Ultrasound scan conversion with spatial dithering

Applications Claiming Priority (9)

Application Number Priority Date Filing Date Title
US08/496,804 US5590658A (en) 1995-06-29 1995-06-29 Portable ultrasound imaging system
US08/496,805 US5839442A (en) 1995-06-29 1995-06-29 Portable ultrasound imaging system
US08/599,816 US5690114A (en) 1995-06-29 1996-02-12 Portable ultrasound imaging system
PCT/US1996/011166 WO1997001768A2 (en) 1995-06-29 1996-06-28 Portable ultrasound imaging system
US08/773,647 US5904652A (en) 1995-06-29 1996-12-24 Ultrasound scan conversion with spatial dithering
PCT/US1997/024291 WO1998028631A2 (en) 1996-12-24 1997-12-23 Ultrasound scan conversion with spatial dithering
US09/203,877 US6248073B1 (en) 1995-06-29 1998-12-02 Ultrasound scan conversion with spatial dithering
US09/447,144 US6379304B1 (en) 1995-06-29 1999-11-23 Ultrasound scan conversion with spatial dithering
US10/114,183 US20030028113A1 (en) 1995-06-29 2002-04-02 Ultrasound scan conversion with spatial dithering

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US09/447,144 Continuation US6379304B1 (en) 1995-06-29 1999-11-23 Ultrasound scan conversion with spatial dithering

Publications (1)

Publication Number Publication Date
US20030028113A1 true US20030028113A1 (en) 2003-02-06

Family

ID=27504383

Family Applications (3)

Application Number Title Priority Date Filing Date
US09/203,877 Expired - Lifetime US6248073B1 (en) 1995-06-29 1998-12-02 Ultrasound scan conversion with spatial dithering
US09/447,144 Expired - Lifetime US6379304B1 (en) 1995-06-29 1999-11-23 Ultrasound scan conversion with spatial dithering
US10/114,183 Abandoned US20030028113A1 (en) 1995-06-29 2002-04-02 Ultrasound scan conversion with spatial dithering

Family Applications Before (2)

Application Number Title Priority Date Filing Date
US09/203,877 Expired - Lifetime US6248073B1 (en) 1995-06-29 1998-12-02 Ultrasound scan conversion with spatial dithering
US09/447,144 Expired - Lifetime US6379304B1 (en) 1995-06-29 1999-11-23 Ultrasound scan conversion with spatial dithering

Country Status (1)

Country Link
US (3) US6248073B1 (en)

Cited By (69)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030153816A1 (en) * 2002-02-11 2003-08-14 General Electric Company Method and system for conducting medical imaging transactions
US20040015079A1 (en) * 1999-06-22 2004-01-22 Teratech Corporation Ultrasound probe with integrated electronics
US20040158154A1 (en) * 2003-02-06 2004-08-12 Siemens Medical Solutions Usa, Inc. Portable three dimensional diagnostic ultrasound imaging methods and systems
WO2004086082A1 (en) * 2003-03-27 2004-10-07 Koninklijke Philips Electronics N.V. Guidance of invasive medical devices by wide view three dimensional ultrasonic imaging
US20040258127A1 (en) * 2003-06-23 2004-12-23 Siemens Medical Solutions Usa, Inc. Ultrasound transducer fault measurement method and system
US20050113690A1 (en) * 2003-11-25 2005-05-26 Nahi Halmann Methods and systems for providing portable device extended resources
WO2005059586A1 (en) * 2003-12-16 2005-06-30 Koninklijke Philips Electronics, N.V. Ultrasonic diagnostic imaging system with automatic control of penetration, resolution and frame rate
WO2006113445A1 (en) * 2005-04-14 2006-10-26 Verasonics, Inc. Ultrasound imaging system with pixel oriented processing
US7270634B2 (en) 2003-03-27 2007-09-18 Koninklijke Philips Electronics N.V. Guidance of invasive medical devices by high resolution three dimensional ultrasonic imaging
US20080086054A1 (en) * 2006-10-04 2008-04-10 Slayton Michael H Ultrasound system and method for imaging and/or measuring displacement of moving tissue and fluid
US20080154133A1 (en) * 2006-09-19 2008-06-26 Kabushiki Kaisha Toshiba Ultrasound diagnostic apparatus and method for acquiring ultrasound data
US20080298654A1 (en) * 2007-06-01 2008-12-04 Roth Scott L Temperature management for ultrasound imaging at high frame rates
US20090316974A1 (en) * 2008-06-20 2009-12-24 Kai Ji Data input method and ultrasonic imaging apparatus
US20090326379A1 (en) * 2008-06-26 2009-12-31 Ronald Elvin Daigle High frame rate quantitative doppler flow imaging using unfocused transmit beams
US20100094132A1 (en) * 2008-10-10 2010-04-15 Sonosite, Inc. Ultrasound system having a simplified user interface
US20100106020A1 (en) * 2008-10-28 2010-04-29 Soo-Hwan Shin Ultrasound System And Method Providing Wide Image Mode
KR100969545B1 (en) 2007-11-30 2010-07-12 주식회사 메디슨 Portable ultrasonic diagnostic apparatus having a display part swiveable vertically and horizontally
US20110112405A1 (en) * 2008-06-06 2011-05-12 Ulthera, Inc. Hand Wand for Ultrasonic Cosmetic Treatment and Imaging
US20110160582A1 (en) * 2008-04-29 2011-06-30 Yongping Zheng Wireless ultrasonic scanning system
US8166332B2 (en) 2005-04-25 2012-04-24 Ardent Sound, Inc. Treatment system for enhancing safety of computer peripheral for use with medical devices by isolating host AC power
US8235909B2 (en) 2004-05-12 2012-08-07 Guided Therapy Systems, L.L.C. Method and system for controlled scanning, imaging and/or therapy
US8282554B2 (en) 2004-10-06 2012-10-09 Guided Therapy Systems, Llc Methods for treatment of sweat glands
WO2012170714A1 (en) * 2011-06-08 2012-12-13 University Of Virginia Patent Foundation Separable beamforming for ultrasound array
US8366622B2 (en) 2004-10-06 2013-02-05 Guided Therapy Systems, Llc Treatment of sub-dermal regions for cosmetic effects
US8409097B2 (en) 2000-12-28 2013-04-02 Ardent Sound, Inc Visual imaging system for ultrasonic probe
US8444562B2 (en) 2004-10-06 2013-05-21 Guided Therapy Systems, Llc System and method for treating muscle, tendon, ligament and cartilage tissue
US8460193B2 (en) 2004-10-06 2013-06-11 Guided Therapy Systems Llc System and method for ultra-high frequency ultrasound treatment
US8480585B2 (en) 1997-10-14 2013-07-09 Guided Therapy Systems, Llc Imaging, therapy and temperature monitoring ultrasonic system and method
US8535228B2 (en) 2004-10-06 2013-09-17 Guided Therapy Systems, Llc Method and system for noninvasive face lifts and deep tissue tightening
US8636665B2 (en) 2004-10-06 2014-01-28 Guided Therapy Systems, Llc Method and system for ultrasound treatment of fat
US8663112B2 (en) 2004-10-06 2014-03-04 Guided Therapy Systems, Llc Methods and systems for fat reduction and/or cellulite treatment
US8690778B2 (en) 2004-10-06 2014-04-08 Guided Therapy Systems, Llc Energy-based tissue tightening
US8708935B2 (en) 2004-09-16 2014-04-29 Guided Therapy Systems, Llc System and method for variable depth ultrasound treatment
US8715186B2 (en) 2009-11-24 2014-05-06 Guided Therapy Systems, Llc Methods and systems for generating thermal bubbles for improved ultrasound imaging and therapy
US8764687B2 (en) 2007-05-07 2014-07-01 Guided Therapy Systems, Llc Methods and systems for coupling and focusing acoustic energy using a coupler member
US8858471B2 (en) 2011-07-10 2014-10-14 Guided Therapy Systems, Llc Methods and systems for ultrasound treatment
US8857438B2 (en) 2010-11-08 2014-10-14 Ulthera, Inc. Devices and methods for acoustic shielding
US8915870B2 (en) 2004-10-06 2014-12-23 Guided Therapy Systems, Llc Method and system for treating stretch marks
US9011336B2 (en) 2004-09-16 2015-04-21 Guided Therapy Systems, Llc Method and system for combined energy therapy profile
US9011337B2 (en) 2011-07-11 2015-04-21 Guided Therapy Systems, Llc Systems and methods for monitoring and controlling ultrasound power output and stability
US20150182201A1 (en) * 2013-12-31 2015-07-02 General Electric Company Ultrasound probe power supply
US9072471B2 (en) 2011-11-28 2015-07-07 Kabushiki Kaisha Toshiba Portable ultrasound diagnosis apparatus
US9114247B2 (en) 2004-09-16 2015-08-25 Guided Therapy Systems, Llc Method and system for ultrasound treatment with a multi-directional transducer
US9149658B2 (en) 2010-08-02 2015-10-06 Guided Therapy Systems, Llc Systems and methods for ultrasound treatment
US9216276B2 (en) 2007-05-07 2015-12-22 Guided Therapy Systems, Llc Methods and systems for modulating medicants using acoustic energy
US9263663B2 (en) 2012-04-13 2016-02-16 Ardent Sound, Inc. Method of making thick film transducer arrays
WO2016057622A1 (en) * 2014-10-07 2016-04-14 Butterfly Network, Inc. Ultrasound signal processing circuitry and related apparatus and methods
USD754357S1 (en) 2011-08-09 2016-04-19 C. R. Bard, Inc. Ultrasound probe head
US9504446B2 (en) 2010-08-02 2016-11-29 Guided Therapy Systems, Llc Systems and methods for coupling an ultrasound source to tissue
US9510802B2 (en) 2012-09-21 2016-12-06 Guided Therapy Systems, Llc Reflective ultrasound technology for dermatological treatments
US9566454B2 (en) 2006-09-18 2017-02-14 Guided Therapy Systems, Llc Method and sysem for non-ablative acne treatment and prevention
US9694212B2 (en) 2004-10-06 2017-07-04 Guided Therapy Systems, Llc Method and system for ultrasound treatment of skin
US9827449B2 (en) 2004-10-06 2017-11-28 Guided Therapy Systems, L.L.C. Systems for treating skin laxity
US10420960B2 (en) 2013-03-08 2019-09-24 Ulthera, Inc. Devices and methods for multi-focus ultrasound therapy
US10561862B2 (en) 2013-03-15 2020-02-18 Guided Therapy Systems, Llc Ultrasound treatment device and methods of use
US10603521B2 (en) 2014-04-18 2020-03-31 Ulthera, Inc. Band transducer ultrasound therapy
US10639008B2 (en) 2009-10-08 2020-05-05 C. R. Bard, Inc. Support and cover structures for an ultrasound probe head
US10820885B2 (en) 2012-06-15 2020-11-03 C. R. Bard, Inc. Apparatus and methods for detection of a removable cap on an ultrasound probe
US10864385B2 (en) 2004-09-24 2020-12-15 Guided Therapy Systems, Llc Rejuvenating skin by heating tissue for cosmetic treatment of the face and body
US11103213B2 (en) 2009-10-08 2021-08-31 C. R. Bard, Inc. Spacers for use with an ultrasound probe
US11207548B2 (en) 2004-10-07 2021-12-28 Guided Therapy Systems, L.L.C. Ultrasound probe for treating skin laxity
US11224895B2 (en) 2016-01-18 2022-01-18 Ulthera, Inc. Compact ultrasound device having annular ultrasound array peripherally electrically connected to flexible printed circuit board and method of assembly thereof
US11235179B2 (en) 2004-10-06 2022-02-01 Guided Therapy Systems, Llc Energy based skin gland treatment
US11241218B2 (en) 2016-08-16 2022-02-08 Ulthera, Inc. Systems and methods for cosmetic ultrasound treatment of skin
US11657540B2 (en) 2019-06-24 2023-05-23 Darkvision Technologies Inc Compression of ultrasound data in fluid conduits
US11717661B2 (en) 2007-05-07 2023-08-08 Guided Therapy Systems, Llc Methods and systems for ultrasound assisted delivery of a medicant to tissue
US11724133B2 (en) 2004-10-07 2023-08-15 Guided Therapy Systems, Llc Ultrasound probe for treatment of skin
US11883688B2 (en) 2004-10-06 2024-01-30 Guided Therapy Systems, Llc Energy based fat reduction
US11944849B2 (en) 2018-02-20 2024-04-02 Ulthera, Inc. Systems and methods for combined cosmetic treatment of cellulite with ultrasound

Families Citing this family (62)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6248073B1 (en) * 1995-06-29 2001-06-19 Teratech Corporation Ultrasound scan conversion with spatial dithering
US6575908B2 (en) * 1996-06-28 2003-06-10 Sonosite, Inc. Balance body ultrasound system
US7819807B2 (en) * 1996-06-28 2010-10-26 Sonosite, Inc. Balance body ultrasound system
US6569101B2 (en) * 2001-04-19 2003-05-27 Sonosite, Inc. Medical diagnostic ultrasound instrument with ECG module, authorization mechanism and methods of use
US6962566B2 (en) * 2001-04-19 2005-11-08 Sonosite, Inc. Medical diagnostic ultrasound instrument with ECG module, authorization mechanism and methods of use
KR100330855B1 (en) * 1999-02-09 2002-04-03 Min-Hwa Lee Digital-ultrasound imaging system of memorizing the pre-received signals and again using the memorized signals
US6735349B1 (en) * 1999-09-15 2004-05-11 Genesis Microchip Inc. Method and system for dual spatial or temporal scaling
US6413217B1 (en) * 2000-03-30 2002-07-02 Ge Medical Systems Global Technology Company, Llc Ultrasound enlarged image display techniques
US6440072B1 (en) * 2000-03-30 2002-08-27 Acuson Corporation Medical diagnostic ultrasound imaging system and method for transferring ultrasound examination data to a portable computing device
KR100388407B1 (en) * 2001-04-27 2003-06-25 주식회사 메디슨 Three-dimensional ultrasound imaging system for performing receiving focusing at voxels corresponding to display pixels
US20020165702A1 (en) * 2001-05-02 2002-11-07 Koninklijke Philips Electronics N.V. Method and system for optimization of apodization circuits
US6638226B2 (en) 2001-09-28 2003-10-28 Teratech Corporation Ultrasound imaging system
US6648826B2 (en) * 2002-02-01 2003-11-18 Sonosite, Inc. CW beam former in an ASIC
US7534211B2 (en) * 2002-03-29 2009-05-19 Sonosite, Inc. Modular apparatus for diagnostic ultrasound
US6907284B2 (en) * 2002-05-03 2005-06-14 Lms Medical Systems Ltd. Method and apparatus for displaying a fetal heart rate signal
US7255678B2 (en) * 2002-10-10 2007-08-14 Visualsonics Inc. High frequency, high frame-rate ultrasound imaging system
US7620220B2 (en) * 2003-03-21 2009-11-17 Boston Scientific Scimed, Inc. Scan conversion of medical imaging data from polar format to cartesian format
US6896657B2 (en) * 2003-05-23 2005-05-24 Scimed Life Systems, Inc. Method and system for registering ultrasound image in three-dimensional coordinate system
EP1695113A2 (en) 2003-11-26 2006-08-30 Teratech Corporation Modular portable ultrasound systems
US8257262B2 (en) * 2003-12-19 2012-09-04 Siemens Medical Solutions Usa, Inc. Ultrasound adaptor methods and systems for transducer and system separation
US7998072B2 (en) * 2003-12-19 2011-08-16 Siemens Medical Solutions Usa, Inc. Probe based digitizing or compression system and method for medical ultrasound
US7637871B2 (en) * 2004-02-26 2009-12-29 Siemens Medical Solutions Usa, Inc. Steered continuous wave doppler methods and systems for two-dimensional ultrasound transducer arrays
US7794400B2 (en) * 2004-02-26 2010-09-14 Siemens Medical Solutions Usa, Inc. Element mapping and transmitter for continuous wave ultrasound imaging
EP1728098A2 (en) * 2004-03-01 2006-12-06 Sunnybrook and Women's College Health Sciences Centre System and method for ecg-triggered retrospective color flow ultrasound imaging
US20050228281A1 (en) * 2004-03-31 2005-10-13 Nefos Thomas P Handheld diagnostic ultrasound system with head mounted display
US8900149B2 (en) 2004-04-02 2014-12-02 Teratech Corporation Wall motion analyzer
US20050264585A1 (en) * 2004-05-26 2005-12-01 Trombley Michael G Visual display transformation
US20060058654A1 (en) * 2004-08-24 2006-03-16 Gerois Di Marco System and method for providing a user interface for an ultrasound system
US7611463B2 (en) * 2004-10-28 2009-11-03 General Electric Company Ultrasound beamformer with high speed serial control bus packetized protocol
US8002708B2 (en) * 2005-01-11 2011-08-23 General Electric Company Ultrasound beamformer with scalable receiver boards
CN101163986A (en) * 2005-04-18 2008-04-16 Koninklijke Philips Electronics N.V. Portable ultrasonic diagnostic imaging system with docking station
WO2006111872A2 (en) * 2005-04-18 2006-10-26 Koninklijke Philips Electronics, N.V. Pc-based portable ultrasonic diagnostic imaging system
US8066642B1 (en) 2005-05-03 2011-11-29 Sonosite, Inc. Systems and methods for ultrasound beam forming data control
US20070167705A1 (en) * 2005-08-04 2007-07-19 Chiang Alice M Integrated ultrasound imaging system
US20070239019A1 (en) * 2006-02-13 2007-10-11 Richard William D Portable ultrasonic imaging probe than connects directly to a host computer
US8286079B2 (en) * 2006-09-19 2012-10-09 Siemens Medical Solutions Usa, Inc. Context aware user interface for medical diagnostic imaging, such as ultrasound imaging
US20090069682A1 (en) * 2007-01-24 2009-03-12 Hastings Harold M Simplified controls for implementing depth-based gain control in ultrasound systems
WO2008124841A2 (en) * 2007-04-10 2008-10-16 C. R. Bard, Inc. Low power ultrasound system
WO2009005743A1 (en) * 2007-06-29 2009-01-08 Teratech Corporation High-frequency tissue imaging devices and methods
CN101380237B (en) 2007-09-04 2012-03-21 Shenzhen Mindray Bio-Medical Electronics Co., Ltd. Scanning transform method and device for ultrasonic image-forming
JP2009268734A (en) * 2008-05-08 2009-11-19 Olympus Medical Systems Corp Ultrasound observation apparatus
WO2010020939A2 (en) * 2008-08-21 2010-02-25 Koninklijke Philips Electronics, N.V. Wireless ultrasound monitoring device
US9934554B2 (en) * 2010-11-15 2018-04-03 Indian Institute Of Technology Ultrasound imaging method/technique for speckle reduction/suppression in an improved ultra sound imaging system
WO2012120569A1 (en) * 2011-03-07 2012-09-13 Panasonic Corporation Phase-to-digital conversion circuit and phase-to-digital converter provided therewith
JP5950599B2 (en) * 2011-11-07 2016-07-13 Canon Inc. Subject information acquisition apparatus, subject information acquisition method, and program
US20130172757A1 (en) * 2012-01-04 2013-07-04 General Electric Company Multiple transducer automatic initiation
US10499878B2 (en) 2012-07-26 2019-12-10 Interson Corporation Portable ultrasonic imaging probe including a transducer array
US9691374B2 (en) 2012-11-28 2017-06-27 The Regents Of The University Of Michigan Processing a stream of ordered input data
JP2016501634A (en) * 2012-12-21 2016-01-21 Volcano Corporation Method for multi-frequency imaging using the output of a high bandwidth transducer
US10154826B2 (en) 2013-07-17 2018-12-18 Tissue Differentiation Intelligence, Llc Device and method for identifying anatomical structures
US10716536B2 (en) 2013-07-17 2020-07-21 Tissue Differentiation Intelligence, Llc Identifying anatomical structures
US9955950B2 (en) * 2014-07-30 2018-05-01 General Electric Company Systems and methods for steering multiple ultrasound beams
US10656254B2 (en) 2015-11-19 2020-05-19 Analog Devices, Inc. Analog ultrasound beamformer
US11701086B1 (en) 2016-06-21 2023-07-18 Tissue Differentiation Intelligence, Llc Methods and systems for improved nerve detection
US10935645B2 (en) * 2016-10-25 2021-03-02 George Mason University Method and apparatus for low-power ultraportable ultrasound imaging
US10469846B2 (en) 2017-03-27 2019-11-05 Vave Health, Inc. Dynamic range compression of ultrasound images
US10856843B2 (en) 2017-03-23 2020-12-08 Vave Health, Inc. Flag table based beamforming in a handheld ultrasound device
US11446003B2 (en) 2017-03-27 2022-09-20 Vave Health, Inc. High performance handheld ultrasound
US11531096B2 (en) 2017-03-23 2022-12-20 Vave Health, Inc. High performance handheld ultrasound
US10945706B2 (en) 2017-05-05 2021-03-16 Biim Ultrasound As Hand held ultrasound probe
KR20200100469A (en) * 2019-02-18 2020-08-26 Samsung Medison Co., Ltd. Analog beam former
US20230061122A1 (en) * 2021-08-24 2023-03-02 Saudi Arabian Oil Company Convex ultrasonic sensor for weld inspection

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5522391A (en) * 1994-08-09 1996-06-04 Hewlett-Packard Company Delay generator for phased array ultrasound beamformer
US5763785A (en) 1995-06-29 1998-06-09 Massachusetts Institute Of Technology Integrated beam forming and focusing processing circuit for use in an ultrasound imaging system

Patent Citations (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3953668A (en) * 1975-05-27 1976-04-27 Bell Telephone Laboratories, Incorporated Method and arrangement for eliminating flicker in interlaced ordered dither images
US4152678A (en) * 1976-07-01 1979-05-01 Board of Trustees of the Leland Stanford Jr. Unv. Cascade charge coupled delay line device for compound delays
US4092867A (en) * 1977-02-10 1978-06-06 Terrance Matzuk Ultrasonic scanning apparatus
US4227417A (en) * 1977-06-13 1980-10-14 New York Institute Of Technology Dynamic focusing apparatus and method
US4401957A (en) * 1977-07-01 1983-08-30 Siemens Gammasonics, Inc. Permutating analog shift register variable delay system
US4689675A (en) * 1985-05-23 1987-08-25 Advanced Systems Development, Inc. Digital scan converter and method therefor
US4809184A (en) * 1986-10-22 1989-02-28 General Electric Company Method and apparatus for fully digital beam formation in a phased array coherent imaging system
US5257629A (en) * 1989-05-26 1993-11-02 Intravascular Research Limited Methods and apparatus for the examination and treatment of internal organs
US5109282A (en) * 1990-06-20 1992-04-28 Eye Research Institute Of Retina Foundation Halftone imaging method and apparatus utilizing pyramidol error convergence
US5355303A (en) * 1990-11-21 1994-10-11 Polaroid Corporation Printing apparatus
US5369497A (en) * 1990-11-21 1994-11-29 Allen; Janet A. Apparatus and method for modulating the area of pixels for tone reproduction
US5477305A (en) * 1990-12-04 1995-12-19 Research Corporation Technologies Method and apparatus for halftone rendering of a gray scale image using a blue noise mask
US5286964A (en) * 1991-02-19 1994-02-15 Phoenix Laser Systems, Inc. System for detecting, correcting and measuring depth movement of a target
US5272627A (en) * 1991-03-27 1993-12-21 Gulton Industries, Inc. Data converter for CT data acquisition system
US5127409A (en) * 1991-04-25 1992-07-07 Daigle Ronald E Ultrasound Doppler position sensing
US5222391A (en) * 1991-08-07 1993-06-29 Reenstra Arthur L Tennis ball tester
US5199437A (en) * 1991-09-09 1993-04-06 Sensor Electronics, Inc. Ultrasonic imager
US5435313A (en) * 1991-10-08 1995-07-25 Ge Yokogawa Medical Systems, Ltd. Ultrasonic probe
US5295485A (en) * 1991-12-13 1994-03-22 Hitachi, Ltd. Ultrasonic diagnostic system
US5415167A (en) * 1992-01-10 1995-05-16 Wilk; Peter J. Medical system and associated method for automatic diagnosis and treatment
US5345426A (en) * 1993-05-12 1994-09-06 Hewlett-Packard Company Delay interpolator for digital phased array ultrasound beamformers
US5479594A (en) * 1993-09-10 1995-12-26 Ati Technologies Inc. Digital color video image enhancement for a diffusion dither circuit
US5406949A (en) * 1994-07-18 1995-04-18 Siemens Medical System Digital processing for steerable CW doppler
US5535751A (en) * 1994-12-22 1996-07-16 Morphometrix Technologies Inc. Confocal ultrasonic imaging system
US5538004A (en) * 1995-02-28 1996-07-23 Hewlett-Packard Company Method and apparatus for tissue-centered scan conversion in an ultrasound imaging system
US5590658A (en) * 1995-06-29 1997-01-07 Teratech Corporation Portable ultrasound imaging system
US5904652A (en) * 1995-06-29 1999-05-18 Teratech Corporation Ultrasound scan conversion with spatial dithering
US6248073B1 (en) * 1995-06-29 2001-06-19 Teratech Corporation Ultrasound scan conversion with spatial dithering
US6379304B1 (en) * 1995-06-29 2002-04-30 Teratech Corporation Ultrasound scan conversion with spatial dithering
US5758649A (en) * 1995-09-01 1998-06-02 Fujitsu Limited Ultrasonic module and ultrasonic diagnostic system
US5722412A (en) * 1996-06-28 1998-03-03 Advanced Technology Laboratories, Inc. Hand held ultrasonic diagnostic instrument
US5817024A (en) * 1996-06-28 1998-10-06 Sonosight, Inc. Hand held ultrasonic diagnostic instrument with digital beamformer
US5795297A (en) * 1996-09-12 1998-08-18 Atlantis Diagnostics International, L.L.C. Ultrasonic diagnostic imaging system with personal computer architecture
US5947901A (en) * 1997-09-09 1999-09-07 Redano; Richard T. Method for hemodynamic stimulation and monitoring

Cited By (173)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9272162B2 (en) 1997-10-14 2016-03-01 Guided Therapy Systems, Llc Imaging, therapy, and temperature monitoring ultrasonic method
US8480585B2 (en) 1997-10-14 2013-07-09 Guided Therapy Systems, Llc Imaging, therapy and temperature monitoring ultrasonic system and method
US20040015079A1 (en) * 1999-06-22 2004-01-22 Teratech Corporation Ultrasound probe with integrated electronics
US11547382B2 (en) 1999-06-22 2023-01-10 Teratech Corporation Networked ultrasound system and method for imaging a medical procedure using an invasive probe
US9907535B2 (en) 2000-12-28 2018-03-06 Ardent Sound, Inc. Visual imaging system for ultrasonic probe
US8409097B2 (en) 2000-12-28 2013-04-02 Ardent Sound, Inc Visual imaging system for ultrasonic probe
US20030153816A1 (en) * 2002-02-11 2003-08-14 General Electric Company Method and system for conducting medical imaging transactions
US6931270B2 (en) * 2002-02-11 2005-08-16 General Electric Company Method and system for conducting medical imaging transactions
US20040158154A1 (en) * 2003-02-06 2004-08-12 Siemens Medical Solutions Usa, Inc. Portable three dimensional diagnostic ultrasound imaging methods and systems
WO2004086082A1 (en) * 2003-03-27 2004-10-07 Koninklijke Philips Electronics N.V. Guidance of invasive medical devices by wide view three dimensional ultrasonic imaging
JP2006521146A (en) * 2003-03-27 2006-09-21 Koninklijke Philips Electronics N.V. Method and apparatus for guiding an invasive medical device by wide view three-dimensional ultrasound imaging
US20060182320A1 (en) * 2003-03-27 2006-08-17 Koninklijke Philips Electronics N.V. Guidance of invasive medical devices by wide view three dimensional ultrasonic imaging
US7529393B2 (en) 2003-03-27 2009-05-05 Koninklijke Philips Electronics, N.V. Guidance of invasive medical devices by wide view three dimensional ultrasonic imaging
US7270634B2 (en) 2003-03-27 2007-09-18 Koninklijke Philips Electronics N.V. Guidance of invasive medical devices by high resolution three dimensional ultrasonic imaging
US20040258127A1 (en) * 2003-06-23 2004-12-23 Siemens Medical Solutions Usa, Inc. Ultrasound transducer fault measurement method and system
US20070081576A1 (en) * 2003-06-23 2007-04-12 Siemens Medical Solutions Usa, Inc. Ultrasound transducer fault measurement method and system
US7156551B2 (en) * 2003-06-23 2007-01-02 Siemens Medical Solutions Usa, Inc. Ultrasound transducer fault measurement method and system
US7481577B2 (en) * 2003-06-23 2009-01-27 Siemens Medical Solutions Usa, Inc. Ultrasound transducer fault measurement method and system
US20050113690A1 (en) * 2003-11-25 2005-05-26 Nahi Halmann Methods and systems for providing portable device extended resources
US10925585B2 (en) * 2003-12-16 2021-02-23 Koninklijke Philips N.V. Ultrasonic diagnostic imaging system with automatic control of penetration, resolution and frame rate
WO2005059586A1 (en) * 2003-12-16 2005-06-30 Koninklijke Philips Electronics, N.V. Ultrasonic diagnostic imaging system with automatic control of penetration, resolution and frame rate
US20100010352A1 (en) * 2003-12-16 2010-01-14 Koninklijke Philips Electronics N.V. Ultrasonic diagnostic imaging system with automatic control of penetration, resolution and frame rate
US20070276236A1 (en) * 2003-12-16 2007-11-29 Koninklijke Philips Electronics N.V. Ultrasonic diagnostic imaging system with automatic control of penetration, resolution and frame rate
US8235909B2 (en) 2004-05-12 2012-08-07 Guided Therapy Systems, L.L.C. Method and system for controlled scanning, imaging and/or therapy
US9011336B2 (en) 2004-09-16 2015-04-21 Guided Therapy Systems, Llc Method and system for combined energy therapy profile
US10039938B2 (en) 2004-09-16 2018-08-07 Guided Therapy Systems, Llc System and method for variable depth ultrasound treatment
US8708935B2 (en) 2004-09-16 2014-04-29 Guided Therapy Systems, Llc System and method for variable depth ultrasound treatment
US9114247B2 (en) 2004-09-16 2015-08-25 Guided Therapy Systems, Llc Method and system for ultrasound treatment with a multi-directional transducer
US9895560B2 (en) 2004-09-24 2018-02-20 Guided Therapy Systems, Llc Methods for rejuvenating skin by heating tissue for cosmetic treatment of the face and body
US9095697B2 (en) 2004-09-24 2015-08-04 Guided Therapy Systems, Llc Methods for preheating tissue for cosmetic treatment of the face and body
US11590370B2 (en) 2004-09-24 2023-02-28 Guided Therapy Systems, Llc Rejuvenating skin by heating tissue for cosmetic treatment of the face and body
US10864385B2 (en) 2004-09-24 2020-12-15 Guided Therapy Systems, Llc Rejuvenating skin by heating tissue for cosmetic treatment of the face and body
US10328289B2 (en) 2004-09-24 2019-06-25 Guided Therapy Systems, Llc Rejuvenating skin by heating tissue for cosmetic treatment of the face and body
US9427600B2 (en) 2004-10-06 2016-08-30 Guided Therapy Systems, L.L.C. Systems for treating skin laxity
US10603523B2 (en) 2004-10-06 2020-03-31 Guided Therapy Systems, Llc Ultrasound probe for tissue treatment
US11883688B2 (en) 2004-10-06 2024-01-30 Guided Therapy Systems, Llc Energy based fat reduction
US8333700B1 (en) 2004-10-06 2012-12-18 Guided Therapy Systems, L.L.C. Methods for treatment of hyperhidrosis
US8366622B2 (en) 2004-10-06 2013-02-05 Guided Therapy Systems, Llc Treatment of sub-dermal regions for cosmetic effects
US8282554B2 (en) 2004-10-06 2012-10-09 Guided Therapy Systems, Llc Methods for treatment of sweat glands
US8444562B2 (en) 2004-10-06 2013-05-21 Guided Therapy Systems, Llc System and method for treating muscle, tendon, ligament and cartilage tissue
US8460193B2 (en) 2004-10-06 2013-06-11 Guided Therapy Systems Llc System and method for ultra-high frequency ultrasound treatment
US11717707B2 (en) 2004-10-06 2023-08-08 Guided Therapy Systems, Llc System and method for noninvasive skin tightening
US8506486B2 (en) 2004-10-06 2013-08-13 Guided Therapy Systems, Llc Ultrasound treatment of sub-dermal tissue for cosmetic effects
US8523775B2 (en) 2004-10-06 2013-09-03 Guided Therapy Systems, Llc Energy based hyperhidrosis treatment
US8535228B2 (en) 2004-10-06 2013-09-17 Guided Therapy Systems, Llc Method and system for noninvasive face lifts and deep tissue tightening
US8636665B2 (en) 2004-10-06 2014-01-28 Guided Therapy Systems, Llc Method and system for ultrasound treatment of fat
US8641622B2 (en) 2004-10-06 2014-02-04 Guided Therapy Systems, Llc Method and system for treating photoaged tissue
US8663112B2 (en) 2004-10-06 2014-03-04 Guided Therapy Systems, Llc Methods and systems for fat reduction and/or cellulite treatment
US8672848B2 (en) 2004-10-06 2014-03-18 Guided Therapy Systems, Llc Method and system for treating cellulite
US8690778B2 (en) 2004-10-06 2014-04-08 Guided Therapy Systems, Llc Energy-based tissue tightening
US8690779B2 (en) 2004-10-06 2014-04-08 Guided Therapy Systems, Llc Noninvasive aesthetic treatment for tightening tissue
US8690780B2 (en) 2004-10-06 2014-04-08 Guided Therapy Systems, Llc Noninvasive tissue tightening for cosmetic effects
US11697033B2 (en) 2004-10-06 2023-07-11 Guided Therapy Systems, Llc Methods for lifting skin tissue
US11400319B2 (en) 2004-10-06 2022-08-02 Guided Therapy Systems, Llc Methods for lifting skin tissue
US11338156B2 (en) 2004-10-06 2022-05-24 Guided Therapy Systems, Llc Noninvasive tissue tightening system
US11235180B2 (en) 2004-10-06 2022-02-01 Guided Therapy Systems, Llc System and method for noninvasive skin tightening
US11235179B2 (en) 2004-10-06 2022-02-01 Guided Therapy Systems, Llc Energy based skin gland treatment
US11207547B2 (en) 2004-10-06 2021-12-28 Guided Therapy Systems, Llc Probe for ultrasound tissue treatment
US11179580B2 (en) 2004-10-06 2021-11-23 Guided Therapy Systems, Llc Energy based fat reduction
US8915870B2 (en) 2004-10-06 2014-12-23 Guided Therapy Systems, Llc Method and system for treating stretch marks
US8915854B2 (en) 2004-10-06 2014-12-23 Guided Therapy Systems, Llc Method for fat and cellulite reduction
US8915853B2 (en) 2004-10-06 2014-12-23 Guided Therapy Systems, Llc Methods for face and neck lifts
US8920324B2 (en) 2004-10-06 2014-12-30 Guided Therapy Systems, Llc Energy based fat reduction
US8932224B2 (en) 2004-10-06 2015-01-13 Guided Therapy Systems, Llc Energy based hyperhidrosis treatment
US11167155B2 (en) 2004-10-06 2021-11-09 Guided Therapy Systems, Llc Ultrasound probe for treatment of skin
US10960236B2 (en) 2004-10-06 2021-03-30 Guided Therapy Systems, Llc System and method for noninvasive skin tightening
US10888717B2 (en) 2004-10-06 2021-01-12 Guided Therapy Systems, Llc Probe for ultrasound tissue treatment
US10888716B2 (en) 2004-10-06 2021-01-12 Guided Therapy Systems, Llc Energy based fat reduction
US10888718B2 (en) 2004-10-06 2021-01-12 Guided Therapy Systems, L.L.C. Ultrasound probe for treating skin laxity
US9039619B2 (en) 2004-10-06 2015-05-26 Guided Therapy Systems, L.L.C. Methods for treating skin laxity
US10610705B2 (en) 2004-10-06 2020-04-07 Guided Therapy Systems, L.L.C. Ultrasound probe for treating skin laxity
US10610706B2 (en) 2004-10-06 2020-04-07 Guided Therapy Systems, Llc Ultrasound probe for treatment of skin
US10603519B2 (en) 2004-10-06 2020-03-31 Guided Therapy Systems, Llc Energy based fat reduction
US10532230B2 (en) 2004-10-06 2020-01-14 Guided Therapy Systems, Llc Methods for face and neck lifts
US10525288B2 (en) 2004-10-06 2020-01-07 Guided Therapy Systems, Llc System and method for noninvasive skin tightening
US10265550B2 (en) 2004-10-06 2019-04-23 Guided Therapy Systems, L.L.C. Ultrasound probe for treating skin laxity
US10252086B2 (en) 2004-10-06 2019-04-09 Guided Therapy Systems, Llc Ultrasound probe for treatment of skin
US10245450B2 (en) 2004-10-06 2019-04-02 Guided Therapy Systems, Llc Ultrasound probe for fat and cellulite reduction
US10238894B2 (en) 2004-10-06 2019-03-26 Guided Therapy Systems, L.L.C. Energy based fat reduction
US9283410B2 (en) 2004-10-06 2016-03-15 Guided Therapy Systems, L.L.C. System and method for fat and cellulite reduction
US9283409B2 (en) 2004-10-06 2016-03-15 Guided Therapy Systems, Llc Energy based fat reduction
US10046181B2 (en) 2004-10-06 2018-08-14 Guided Therapy Systems, Llc Energy based hyperhidrosis treatment
US10046182B2 (en) 2004-10-06 2018-08-14 Guided Therapy Systems, Llc Methods for face and neck lifts
US10010724B2 (en) 2004-10-06 2018-07-03 Guided Therapy Systems, L.L.C. Ultrasound probe for treating skin laxity
US9320537B2 (en) 2004-10-06 2016-04-26 Guided Therapy Systems, Llc Methods for noninvasive skin tightening
US10010726B2 (en) 2004-10-06 2018-07-03 Guided Therapy Systems, Llc Ultrasound probe for treatment of skin
US9421029B2 (en) 2004-10-06 2016-08-23 Guided Therapy Systems, Llc Energy based hyperhidrosis treatment
US10010725B2 (en) 2004-10-06 2018-07-03 Guided Therapy Systems, Llc Ultrasound probe for fat and cellulite reduction
US9427601B2 (en) 2004-10-06 2016-08-30 Guided Therapy Systems, Llc Methods for face and neck lifts
US10010721B2 (en) 2004-10-06 2018-07-03 Guided Therapy Systems, L.L.C. Energy based fat reduction
US9440096B2 (en) 2004-10-06 2016-09-13 Guided Therapy Systems, Llc Method and system for treating stretch marks
US9974982B2 (en) 2004-10-06 2018-05-22 Guided Therapy Systems, Llc System and method for noninvasive skin tightening
US9833640B2 (en) 2004-10-06 2017-12-05 Guided Therapy Systems, L.L.C. Method and system for ultrasound treatment of skin
US9833639B2 (en) 2004-10-06 2017-12-05 Guided Therapy Systems, L.L.C. Energy based fat reduction
US9522290B2 (en) 2004-10-06 2016-12-20 Guided Therapy Systems, Llc System and method for fat and cellulite reduction
US9533175B2 (en) 2004-10-06 2017-01-03 Guided Therapy Systems, Llc Energy based fat reduction
US9827450B2 (en) 2004-10-06 2017-11-28 Guided Therapy Systems, L.L.C. System and method for fat and cellulite reduction
US9827449B2 (en) 2004-10-06 2017-11-28 Guided Therapy Systems, L.L.C. Systems for treating skin laxity
US9694212B2 (en) 2004-10-06 2017-07-04 Guided Therapy Systems, Llc Method and system for ultrasound treatment of skin
US9694211B2 (en) 2004-10-06 2017-07-04 Guided Therapy Systems, L.L.C. Systems for treating skin laxity
US9700340B2 (en) 2004-10-06 2017-07-11 Guided Therapy Systems, Llc System and method for ultra-high frequency ultrasound treatment
US9707412B2 (en) 2004-10-06 2017-07-18 Guided Therapy Systems, Llc System and method for fat and cellulite reduction
US9713731B2 (en) 2004-10-06 2017-07-25 Guided Therapy Systems, Llc Energy based fat reduction
US11724133B2 (en) 2004-10-07 2023-08-15 Guided Therapy Systems, Llc Ultrasound probe for treatment of skin
US11207548B2 (en) 2004-10-07 2021-12-28 Guided Therapy Systems, L.L.C. Ultrasound probe for treating skin laxity
US20090112095A1 (en) * 2005-04-14 2009-04-30 Verasonics, Inc. Ultrasound imaging system with pixel oriented processing
US9028411B2 (en) 2005-04-14 2015-05-12 Verasonics, Inc. Ultrasound imaging system with pixel oriented processing
US9649094B2 (en) 2005-04-14 2017-05-16 Verasonics, Inc. Ultrasound imaging system with pixel oriented processing
WO2006113445A1 (en) * 2005-04-14 2006-10-26 Verasonics, Inc. Ultrasound imaging system with pixel oriented processing
US8287456B2 (en) 2005-04-14 2012-10-16 Verasonics, Inc. Ultrasound imaging system with pixel oriented processing
US8166332B2 (en) 2005-04-25 2012-04-24 Ardent Sound, Inc. Treatment system for enhancing safety of computer peripheral for use with medical devices by isolating host AC power
US8868958B2 (en) 2005-04-25 2014-10-21 Ardent Sound, Inc Method and system for enhancing computer peripheral safety
US9566454B2 (en) 2006-09-18 2017-02-14 Guided Therapy Systems, Llc Method and sysem for non-ablative acne treatment and prevention
US9316725B2 (en) * 2006-09-19 2016-04-19 Kabushiki Kaisha Toshiba Ultrasound diagnostic apparatus and method for acquiring ultrasound data
US20080154133A1 (en) * 2006-09-19 2008-06-26 Kabushiki Kaisha Toshiba Ultrasound diagnostic apparatus and method for acquiring ultrasound data
US20080086054A1 (en) * 2006-10-04 2008-04-10 Slayton Michael H Ultrasound system and method for imaging and/or measuring displacement of moving tissue and fluid
US9241683B2 (en) * 2006-10-04 2016-01-26 Ardent Sound Inc. Ultrasound system and method for imaging and/or measuring displacement of moving tissue and fluid
US9216276B2 (en) 2007-05-07 2015-12-22 Guided Therapy Systems, Llc Methods and systems for modulating medicants using acoustic energy
US11717661B2 (en) 2007-05-07 2023-08-08 Guided Therapy Systems, Llc Methods and systems for ultrasound assisted delivery of a medicant to tissue
US8764687B2 (en) 2007-05-07 2014-07-01 Guided Therapy Systems, Llc Methods and systems for coupling and focusing acoustic energy using a coupler member
WO2008150992A1 (en) * 2007-06-01 2008-12-11 Imacor Llc Temperature management for ultrasound imaging at high frame rates
US20080298654A1 (en) * 2007-06-01 2008-12-04 Roth Scott L Temperature management for ultrasound imaging at high frame rates
US8165364B2 (en) 2007-06-01 2012-04-24 Imacor Inc. Temperature management for ultrasound imaging at high frame rates
KR100969545B1 (en) 2007-11-30 2010-07-12 Medison Co., Ltd. Portable ultrasonic diagnostic apparatus having a display part swiveable vertically and horizontally
US20110160582A1 (en) * 2008-04-29 2011-06-30 Yongping Zheng Wireless ultrasonic scanning system
US20110112405A1 (en) * 2008-06-06 2011-05-12 Ulthera, Inc. Hand Wand for Ultrasonic Cosmetic Treatment and Imaging
US11723622B2 (en) 2008-06-06 2023-08-15 Ulthera, Inc. Systems for ultrasound treatment
US11123039B2 (en) 2008-06-06 2021-09-21 Ulthera, Inc. System and method for ultrasound treatment
US10537304B2 (en) 2008-06-06 2020-01-21 Ulthera, Inc. Hand wand for ultrasonic cosmetic treatment and imaging
US20090316974A1 (en) * 2008-06-20 2009-12-24 Kai Ji Data input method and ultrasonic imaging apparatus
US8965073B2 (en) * 2008-06-20 2015-02-24 Ge Medical Systems Global Technology Company, Llc Data input method and ultrasonic imaging apparatus
US10914826B2 (en) 2008-06-26 2021-02-09 Verasonics, Inc. High frame rate quantitative doppler flow imaging using unfocused transmit beams
US20090326379A1 (en) * 2008-06-26 2009-12-31 Ronald Elvin Daigle High frame rate quantitative doppler flow imaging using unfocused transmit beams
US20100094132A1 (en) * 2008-10-10 2010-04-15 Sonosite, Inc. Ultrasound system having a simplified user interface
US20100106020A1 (en) * 2008-10-28 2010-04-29 Soo-Hwan Shin Ultrasound System And Method Providing Wide Image Mode
US10639008B2 (en) 2009-10-08 2020-05-05 C. R. Bard, Inc. Support and cover structures for an ultrasound probe head
US11103213B2 (en) 2009-10-08 2021-08-31 C. R. Bard, Inc. Spacers for use with an ultrasound probe
US9345910B2 (en) 2009-11-24 2016-05-24 Guided Therapy Systems Llc Methods and systems for generating thermal bubbles for improved ultrasound imaging and therapy
US9039617B2 (en) 2009-11-24 2015-05-26 Guided Therapy Systems, Llc Methods and systems for generating thermal bubbles for improved ultrasound imaging and therapy
US8715186B2 (en) 2009-11-24 2014-05-06 Guided Therapy Systems, Llc Methods and systems for generating thermal bubbles for improved ultrasound imaging and therapy
US9504446B2 (en) 2010-08-02 2016-11-29 Guided Therapy Systems, Llc Systems and methods for coupling an ultrasound source to tissue
US10183182B2 (en) 2010-08-02 2019-01-22 Guided Therapy Systems, Llc Methods and systems for treating plantar fascia
US9149658B2 (en) 2010-08-02 2015-10-06 Guided Therapy Systems, Llc Systems and methods for ultrasound treatment
US8857438B2 (en) 2010-11-08 2014-10-14 Ulthera, Inc. Devices and methods for acoustic shielding
US9433398B2 (en) * 2011-06-08 2016-09-06 University Of Virginia Patent Foundation Separable beamforming for ultrasound array
US20140200456A1 (en) * 2011-06-08 2014-07-17 University Of Virginia Patent Foundation Separable beamforming for ultrasound array
WO2012170714A1 (en) * 2011-06-08 2012-12-13 University Of Virginia Patent Foundation Separable beamforming for ultrasound array
US10159464B2 (en) 2011-06-08 2018-12-25 University Of Virginia Patent Foundation Separable beamforming for ultrasound array
US9452302B2 (en) 2011-07-10 2016-09-27 Guided Therapy Systems, Llc Systems and methods for accelerating healing of implanted material and/or native tissue
US8858471B2 (en) 2011-07-10 2014-10-14 Guided Therapy Systems, Llc Methods and systems for ultrasound treatment
US9011337B2 (en) 2011-07-11 2015-04-21 Guided Therapy Systems, Llc Systems and methods for monitoring and controlling ultrasound power output and stability
USD754357S1 (en) 2011-08-09 2016-04-19 C. R. Bard, Inc. Ultrasound probe head
US9072471B2 (en) 2011-11-28 2015-07-07 Kabushiki Kaisha Toshiba Portable ultrasound diagnosis apparatus
US9263663B2 (en) 2012-04-13 2016-02-16 Ardent Sound, Inc. Method of making thick film transducer arrays
US10820885B2 (en) 2012-06-15 2020-11-03 C. R. Bard, Inc. Apparatus and methods for detection of a removable cap on an ultrasound probe
US9802063B2 (en) 2012-09-21 2017-10-31 Guided Therapy Systems, Llc Reflective ultrasound technology for dermatological treatments
US9510802B2 (en) 2012-09-21 2016-12-06 Guided Therapy Systems, Llc Reflective ultrasound technology for dermatological treatments
US10420960B2 (en) 2013-03-08 2019-09-24 Ulthera, Inc. Devices and methods for multi-focus ultrasound therapy
US11517772B2 (en) 2013-03-08 2022-12-06 Ulthera, Inc. Devices and methods for multi-focus ultrasound therapy
US10561862B2 (en) 2013-03-15 2020-02-18 Guided Therapy Systems, Llc Ultrasound treatment device and methods of use
US20150182201A1 (en) * 2013-12-31 2015-07-02 General Electric Company Ultrasound probe power supply
US10603521B2 (en) 2014-04-18 2020-03-31 Ulthera, Inc. Band transducer ultrasound therapy
US11351401B2 (en) 2014-04-18 2022-06-07 Ulthera, Inc. Band transducer ultrasound therapy
WO2016057622A1 (en) * 2014-10-07 2016-04-14 Butterfly Network, Inc. Ultrasound signal processing circuitry and related apparatus and methods
US10371804B2 (en) 2014-10-07 2019-08-06 Butterfly Network, Inc. Ultrasound signal processing circuitry and related apparatus and methods
US20170307741A1 (en) * 2014-10-07 2017-10-26 Butterfly Network, Inc. Ultrasound signal processing circuitry and related apparatus and methods
AU2015328134B2 (en) * 2014-10-07 2018-08-02 Butterfly Network, Inc. Ultrasound signal processing circuitry and related apparatus and methods
JP2017534358A (en) * 2014-10-07 2017-11-24 Butterfly Network, Inc. Ultrasonic signal processing circuit and related apparatus and method
TWI683121B (en) * 2014-10-07 2020-01-21 Butterfly Network, Inc. Ultrasound signal processing circuitry and related apparatus and methods
US11224895B2 (en) 2016-01-18 2022-01-18 Ulthera, Inc. Compact ultrasound device having annular ultrasound array peripherally electrically connected to flexible printed circuit board and method of assembly thereof
US11241218B2 (en) 2016-08-16 2022-02-08 Ulthera, Inc. Systems and methods for cosmetic ultrasound treatment of skin
US11944849B2 (en) 2018-02-20 2024-04-02 Ulthera, Inc. Systems and methods for combined cosmetic treatment of cellulite with ultrasound
US11657540B2 (en) 2019-06-24 2023-05-23 Darkvision Technologies Inc Compression of ultrasound data in fluid conduits

Also Published As

Publication number Publication date
US6379304B1 (en) 2002-04-30
US6248073B1 (en) 2001-06-19

Similar Documents

Publication Publication Date Title
US6379304B1 (en) Ultrasound scan conversion with spatial dithering
EP0949976B1 (en) Ultrasound scan conversion with spatial dithering
WO1998028631A9 (en) Ultrasound scan conversion with spatial dithering
US5839442A (en) Portable ultrasound imaging system
US5590658A (en) Portable ultrasound imaging system
US5722412A (en) Hand held ultrasonic diagnostic instrument
US6406430B1 (en) Ultrasound image display by combining enhanced flow imaging in B-mode and color flow mode
US8241217B2 (en) Portable ultrasound imaging data
US7500952B1 (en) Portable ultrasound imaging system
US7399279B2 (en) Transmitter patterns for multi beam reception
US6106472A (en) Portable ultrasound imaging system
US6416475B1 (en) Ultrasonic signal processor for a hand held ultrasonic diagnostic instrument
US20130281863A1 (en) Portable ultrasound imaging system
JPH11508461A (en) Portable ultrasonic imaging system
JP5496865B2 (en) Portable ultrasound imaging system
EP0996002A2 (en) Method and apparatus for edge enhancement in ultrasound imaging
US6740034B2 (en) Three-dimensional ultrasound imaging system for performing receive-focusing at voxels corresponding to display pixels
AU8361801A (en) Ultrasound scan conversion with spatial dithering

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO PAY ISSUE FEE