WO2015193822A1 - Method and device for measuring vegetation cover on farmland
- Publication number: WO2015193822A1 (PCT/IB2015/054561)
- Authority: WIPO (PCT)
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/188—Vegetation
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01B—SOIL WORKING IN AGRICULTURE OR FORESTRY; PARTS, DETAILS, OR ACCESSORIES OF AGRICULTURAL MACHINES OR IMPLEMENTS, IN GENERAL
- A01B79/00—Methods for working soil
- A01B79/005—Precision agriculture
Abstract
The invention relates to a method for measuring vegetation cover in which an image sensor, translated in at least one direction above the land to be analysed, acquires several images of a portion of said land. Each image is in turn sub-divided into several sectors, for each of which a vegetation cover index is calculated based on the chromatic components of the pixels. By monitoring the position of the sensor with a GPS system and knowing the geometrical parameters of the image, the position on the land of the areas to which the sectors of each image correspond is calculated. For a given number of sectors whose areas on the ground sufficiently overlap, the previously calculated cover indices are compared and a corrected cover index of the corresponding area of land is determined. The invention also relates to a device configured to implement the aforesaid method.
Description
TITLE
"METHOD AND DEVICE FOR MEASURING VEGETATION COVER ON FARMLAND"
DESCRIPTION
The invention relates to a method for measuring or estimating vegetation cover on land, in particular on farmland, and the related device for performing this measurement.
Systems for monitoring and promoting the development of plants during the growing season are currently known in agriculture.
In this field, two of the most important factors that can be monitored and managed directly by the farmer, both in time and in space, are the availability of nutrients and the availability of water.
In fact, detailed knowledge of the spatial and temporal variability of these factors allows optimisation of the quality and/or quantity of agricultural production, in many cases with very limited or practically no costs.
This is achieved using indices representative of the state of health of the plant (or of a group of plants in a given area) and based on which it is possible to decide, for example, when to fertilise and how much fertiliser to supply in a given area of farmland or to establish if the water supply in that area is sufficient, too little or too much.
With this type of control, on the one hand it is possible to obtain maximum benefits for the plant and, on the other, to avoid excessive and unnecessary pollution of the ground caused by the use of excessive doses of fertiliser or other products for the treatment of plants.
Known systems for obtaining these indices are based mainly on the measurement of light reflected by the leaves of the plants.
These measurements are performed using electromagnetic sensors in a given range of wavelengths. Based on precise readings of the light reflected in these wavelengths, it is possible to obtain useful information on the physiological and physiopathological state of the plants being monitored.
The wavelengths useful for this purpose are typically those in the red (spectral range around 690 nm) and near infrared (spectral range from around 780 nm to around 1000 nm) spectrum.
The most well-known and consolidated of the various indices is the NDVI (Normalised Difference Vegetation Index).
The NDVI is based on variations in the optical reflectivity of plants and ground at different wavelengths. The NDVI is calculated as:
NDVI = (NIR - RED) / (NIR + RED)
The ground reflects more light in the red spectrum (RED) than in the near infrared spectrum (NIR), while plants reflect more NIR than RED, as the chlorophyll in plants absorbs a great deal of visible red light.
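As a minimal illustration, the index can be computed directly from the two reflectances; the numeric values below are hypothetical but follow the pattern just described (leaf: high NIR, low RED; soil: the opposite).

```python
def ndvi(nir, red):
    """Normalised Difference Vegetation Index for a pair of reflectances.
    Returns a value in [-1, 1]; dense green vegetation tends towards +1,
    bare soil towards 0 or slightly below."""
    return (nir - red) / (nir + red)

# Illustrative (hypothetical) reflectance values:
leaf = ndvi(nir=0.50, red=0.08)   # healthy leaf: strongly positive
soil = ndvi(nir=0.25, red=0.30)   # bare ground: near zero or negative
```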
NDVI measurements can be obtained with specific sensors using various methods.
For example, by processing aerial or satellite images, it is possible to rapidly generate NDVI maps that cover vast areas.
In recent years the use of sensors has also become widespread for NDVI measurements on a local scale, through specific sensors mounted on vehicles that pass through the farmland to be analysed.
One of these sensors is, for example, the sensor "GreenSeeker", marketed by the company Trimble Navigation.
GreenSeeker is a system of active sensors provided with its own light source, the reflected fraction of which is scanned at a distance of around one metre from the vegetation cover.
A device such as GreenSeeker is described in the patents US 5,296,702, US 5,389,781 and US 5,585,626.
However, NDVI measurement depends greatly both on the sun-sensor-target geometry and on weather conditions (presence of clouds, mist, humidity or presence of water on the leaves, etc.).
Moreover, being a sensor that operates in movement with respect to the target and to the direction of the sun, it is susceptible to reading disturbances, which make it difficult to compare various measurements, performed at different times, within the same area of land.
Another known sensor is "Crop Circle" by Holland Scientific. A sensor of this kind is described in US 2005/0098713.
Crop Circle is provided with an active lighting system that only partly solves the aforesaid problems.
In fact, this device is still affected by problems related to the different angles with which the plants are framed during movement of the vehicle.
Another known sensor is "N-Sensor" by Yara International, described in WO 2013/087052.
As for the previous sensors, this sensor is also influenced both by the position of the sun with respect to the sensor and to the area of land to be analysed and by weather conditions.
In this context, the object of the present invention is to propose a device for measuring vegetation cover on land that overcomes the problems of the prior art described above.
Therefore, an object of the present invention is to propose a method for measuring vegetation cover, in particular on farmland, which is not particularly sensitive to light conditions, to weather conditions and to the relative position between the sensor and the plants to be analysed.
In particular, an object of the invention is to propose a method, and a device, for measuring vegetation cover in which for a same portion of vegetation cover several images taken from different angles are compared.
Another object of the present invention is to propose a method for measuring vegetation cover that provides a vegetation cover index in real time, while the image sensor is moved inside the land to be analysed.
A further object of the present invention is to provide a device for measuring vegetation cover that enables further environmental parameters to be acquired, in order to correct, in real time, the vegetation cover values obtained through the image sensor.
Besides the aforesaid objects, another object of the present invention is also to produce an inexpensive device made with consumer electronics.
These objects are achieved by a method in which several images of a portion of said land to be analysed are acquired by means of an image sensor, translated at least in one direction above said land.
Each image is in turn sub-divided into several sectors for which a vegetation cover index is calculated based on the chromatic components of the pixels.
Monitoring the position of the sensor by means of a GPS system and knowing the geometrical parameters of the image, the position on the land of the various areas to which the sectors of each image taken correspond is calculated.
For a given number of sectors whose areas on the ground are sufficiently overlapped, the previously calculated cover indices are compared and a correct cover index of the corresponding area of land is determined.
In this way, the index is no longer determined by a reading taken with a specific angle between sensor and area of vegetation cover, but is obtained by comparing images taken from different angles during movement of the sensor.
According to the invention, the method for measuring vegetation cover on farmland comprises the following steps:
a) acquisition by means of at least one image sensor having a centre of focus with coordinates (Xn, Yn, Zn), of at least one image (In) of an area of land;
b) attribution of a Vegetation value (V) or Other value (O) to each pixel in the image (In), depending on the chromatic components of the image;
c) determination by means of a GPS device of the coordinates (Xn, Yn, Zn) of the centre of focus of the image sensor;
d) sub-division of the image (In) into a number (I) of sectors (Sn-i) each corresponding to a given area (ASn-i) on the land;
e) determination of the coordinates (XBn-i, YBn-i) of an identifying point (BSn-i), on the ground, of each sector (Sn-i), depending on the coordinates (Xn, Yn, Zn) of the sensor;
f) determination for each sector (Sn-i) of a vegetation cover index (Cn-i) based on the relation between the number of pixels classed as Vegetation (V) and the total number of pixels contained in the sector (Sn-i);
g) repeating steps a) to f) to acquire a number (N) of images (In), each image being acquired with the centre of focus of the sensor whose coordinates (Xn, Yn, Zn) do not coincide with those of the previous points;
h) comparison of the coordinates (XBn-i, YBn-i) of the identifying points (BSn-i) of several sectors (Sn-i) and determination of a group of sectors (Sk-i) in which the distances between the respective coordinates are smaller than a maximum distance (DSi);
i) determination and attribution, based on the cover values (Cn-i) calculated for each sector (Sk-i) in the group, of a vegetation cover index (Ci) for an area of land deriving from the overlapping of the areas corresponding to the sectors (Sk-i) of the aforementioned group.
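Steps f), h) and i) above can be sketched as follows. The greedy grouping rule and the arithmetic mean are illustrative assumptions (the method only requires grouping sectors whose identifying points lie within DSi of each other and deriving one index per group); all names and the sample data are hypothetical.

```python
import math
from statistics import mean

def cover_index(mask):
    """Step f): fraction of pixels classed as Vegetation (True) in a sector mask."""
    flat = [p for row in mask for p in row]
    return sum(flat) / len(flat)

def group_sectors(points_and_covers, max_dist):
    """Steps h)-i): greedily group sectors whose identifying points (BSn-i)
    all lie within max_dist (DSi) of each other, then attribute to each
    group one cover index Ci, here taken as the mean of the Cn-i values."""
    groups = []
    for point, cover in points_and_covers:
        for g in groups:
            if all(math.dist(point, q) <= max_dist for q, _ in g):
                g.append((point, cover))
                break
        else:
            groups.append([(point, cover)])
    return [(g[0][0], mean(c for _, c in g)) for g in groups]

# Two samplings of (almost) the same ground sector from different positions,
# plus one unrelated sector (illustrative coordinates in metres):
samples = [
    ((10.0, 5.0), 0.40),   # (XBn-i, YBn-i), Cn-i from frame n
    ((10.2, 5.1), 0.50),   # same ground sector seen one frame later
    ((25.0, 5.0), 0.10),   # a different sector
]
print(group_sectors(samples, max_dist=1.0))
```

The first two samples fall within the distance threshold and are fused into a single corrected index; the third stays in its own group.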
In a preferred aspect of the invention, the sensor is moved in a straight line at a speed of between 3 km/h and 15 km/h and the images (In) are acquired at a frequency of between 1 Hz and 5 Hz.
By combining these frequencies with the speeds indicated (typical of an agricultural vehicle moving on the ground), it is possible to obtain a quantity of information sufficient to monitor with great precision all of the area to the side of the vehicle that falls within the field of view of the sensor.
According to a further aspect of the invention, the optical axis of the sensor is inclined at an angle (α) of around 45° with respect to a horizontal plane, and at an angle (β) of around 90° with respect to a vertical plane parallel to the direction (X) of movement of the sensor.
According to another aspect of the invention, the area of the image (An) has a trapezoidal shape with the larger base further from the trajectory of movement of the sensor with respect to the smaller base.
According to yet another aspect of the invention, the image (In) is divided into a number of sectors (Sn-i) equal to 10. This number can clearly vary depending on the size of the corresponding area on the ground (An) or based on the precision required in determination of the cover index.
According to a variant of the invention, the point (BSn-i) that identifies each sector is in the centre of gravity of the corresponding area on the ground.
Moreover, according to an aspect of the invention, the maximum distance (DSi) between the respective coordinates of the identifying points (BSn-i) of the group of sectors (Sk-i) is less than a pre-set value calculated based on the geometric positioning parameters of the sensor with respect to the ground.
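One plausible reading of this criterion, offered here as an assumption, is to estimate DSi as the mean nearest-neighbour spacing of the identifying points within a single frame, since that spacing follows directly from the geometric positioning parameters of the sensor.

```python
import math

def max_comparable_distance(points):
    """Estimate DSi as the average distance from each identifying point
    (BSn-i) to its nearest neighbour within one frame. An illustrative
    interpretation, not the patent's prescribed formula."""
    nearest = []
    for p in points:
        d = min(math.dist(p, q) for q in points if q != p)
        nearest.append(d)
    return sum(nearest) / len(nearest)

# Identifying points of the 10 sectors of one frame (hypothetical regular
# layout: two rows of five points, 2 m apart along-track, 4 m cross-track):
points = [(x, y) for y in (3.0, 7.0) for x in range(0, 10, 2)]
print(max_comparable_distance(points))
```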
The objects set are also achieved by a device for measuring the vegetation cover of an area of farmland, comprising:
- at least one image sensor;
- a control unit connected to said sensor;
- a GPS device connected to the control unit;
in which the control unit is configured to perform the steps a) to i) of the method listed above.
In a preferred aspect of the invention, the control unit comprises a local unit and a central unit, where the local unit is configured to perform steps a), b), d), f) and g) of the method and to send the vegetation cover values (Cn-i) for the sectors (Sn-i) to the central unit, while the central unit is configured to perform the remaining steps c), e), h) and i).
According to a variant of the invention, the device is also provided with one or more of the following sensors:
- humidity;
- ambient temperature;
- soil temperature;
- ultrasound.
Further characteristics and advantages of the present invention will become more apparent from the description of an example of a preferred, but not exclusive, embodiment, as illustrated in the accompanying figures, wherein:
- Fig. 1 is a schematic front view of a vehicle on which the device of the invention is mounted;
- Fig. 2 is a top view of the vehicle of Fig. 1;
- Fig. 3 is a schematic view of the area framed by the sensor;
- Fig. 4 is a schematic view that represents several areas framed in subsequent samplings;
- Fig. 5 schematically represents the device according to a variant of the invention.
With reference to Figs. 1 and 2, a pair of sensors T1, T2 is mounted on a schematised vehicle V, each sensor positioned in such a manner as to frame, from above, an area A to the side of the vehicle V.
Said sensors T1, T2 are preferably digital RGB frame image sensors.
However, the device can also have a single image sensor facing only one side of the vehicle V, or even more than two sensors.
The term vehicle V is intended to mean any vehicle or any means capable of moving the sensors through the land in at least one direction.
Preferably, the sensors T1, T2 are connected to each other rigidly and inclined at an angle α of around 45° with respect to a horizontal plane passing through the axis Y that connects the centres of focus C of the two sensors T1, T2 (Fig. 1).
The optical axes O of the two sensors T1, T2 are also orthogonal to the horizontal axis X parallel to the direction of movement M of the vehicle V. In practice, the optical axis O of the sensors is oriented at an angle β of around 90° with respect to a vertical plane passing through said axis X.
Preferably, the sensors T1, T2 are installed at a height H from the ground of between 2 m and 4 m.
The inclination of each sensor, its height from the ground and its FOV (Field of View), appropriately combined, allow framing of an area A of around 10 m in width to the side of the vehicle. More in detail, the ground projection of the sensitive area of each sensor T1, T2 is a trapezium having the smaller base b, with a length of around 2 m, on the side closest to the vehicle and the larger base B, with a length of around 10 m, on the opposite side.
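The near and far edges of the framed strip follow from the mounting height and tilt by simple trigonometry. In the sketch below the 3 m height and 45° tilt come from the description, while the 40° vertical field of view is an assumed example value (the patent does not state the FOV).

```python
import math

def footprint(height_m, tilt_deg, vfov_deg):
    """Horizontal distances from the point below the sensor to the near and
    far edges of the framed ground strip. The optical axis is depressed
    tilt_deg below the horizontal; the strip spans +/- vfov_deg/2 around it."""
    near = height_m / math.tan(math.radians(tilt_deg + vfov_deg / 2))
    far = height_m / math.tan(math.radians(tilt_deg - vfov_deg / 2))
    return near, far

near, far = footprint(height_m=3.0, tilt_deg=45.0, vfov_deg=40.0)
print(round(near, 2), round(far, 2))
```

With these assumed numbers the strip starts about 1.4 m from the sensor's ground point and extends to roughly 6.4 m, which is consistent with a trapezium whose far base is much wider than its near base.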
The device also comprises a control unit P to store and process the images taken by the sensors T1, T2.
Said control unit P can be included, for example, in a computer or in a mobile device, such as a Smartphone or a tablet, suitably programmed.
The control unit P is preferably positioned on the vehicle but, if provided with wireless transmission means (Wi-Fi or Bluetooth®, or other equivalent technology), it can be positioned elsewhere.
The device also comprises a GPS device R positioned on the vehicle V and connected to the control unit P.
In detail, the antenna of the GPS device R is arranged so that its position with respect to the sensors T1, T2 is known and defined.
The control unit is configured to acquire images In of the ground framed by the sensors T1, T2, with a constant frequency preferably between 1 Hz and 5 Hz, while the sensors are moved on the ground in the direction X.
Typically, these sampling frequencies are sufficient when the speed of movement of the sensor is between 3 km/h and 15 km/h.
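As a quick check of these ranges, the along-track distance between consecutive frames is simply the speed divided by the sampling frequency:

```python
def frame_spacing_m(speed_kmh, freq_hz):
    """Along-track distance covered between two consecutive frames,
    in metres: speed converted to m/s, divided by the frame rate."""
    return (speed_kmh / 3.6) / freq_hz

print(frame_spacing_m(3, 5))    # densest sampling of the stated ranges
print(frame_spacing_m(15, 1))   # sparsest sampling of the stated ranges
```

Across the stated ranges the spacing varies from well under a metre to a few metres per frame, so the same ground sector is framed in several successive images.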
When the sensors are mounted on the vehicle V, the direction X is parallel to the direction of movement M.
Subsequently, the control unit P calculates and attributes a "Vegetation" value (V) or "Other" value (O) to each pixel in the image In depending on the chromatic parameters detected in the image.
More in detail, sub-division into the two categories is based on analysis of the chromatic components related to each pixel, calculated starting from the R, G, B values of each pixel.
All the pixels marked "Vegetation" therefore correspond to points in which vegetation cover, and in particular the leafy part of plants, is present. More in detail, vegetation cover will be recognised as such when it has a given state of health, for example when its colour reaches a given shade of green.
Conversely, pixels marked "Other" correspond to the ground, to branches of the plants or to degraded leaf parts.
The threshold for recognition of the vegetation value can be varied and determined according to needs, such as type of plant, characteristic colour of the leaves, etc.
Typical methods for performing this distinction use supervised and unsupervised classification algorithms combined with histogram slicing and histogram stretching techniques, searching for values corresponding to the "Jenks Natural Breaks" within the values of the chromatic components.
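As a minimal illustration of this pixel classification (a sketch under our own assumptions: the patent does not prescribe a specific chromatic index, and the threshold here is a fixed hypothetical value rather than one derived from Jenks Natural Breaks), the widely used excess-green index ExG = 2G - R - B can separate Vegetation from Other:

```python
import numpy as np

def classify_pixels(rgb, threshold=20):
    """Label each pixel of an H x W x 3 RGB image as Vegetation (True)
    or Other (False) using the excess-green index ExG = 2G - R - B.

    The threshold of 20 is a hypothetical fixed value; in practice it
    could be chosen per crop, e.g. by Jenks Natural Breaks on the ExG
    histogram.
    """
    rgb = rgb.astype(np.int32)  # avoid uint8 overflow in the arithmetic
    exg = 2 * rgb[..., 1] - rgb[..., 0] - rgb[..., 2]
    return exg > threshold
```

A bright green leaf pixel scores strongly positive on ExG, while bare soil, with its roughly balanced channels, scores near zero and falls into the Other class.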
According to the invention, for each image In framed at the n-th sampling, the control unit P stores and associates the coordinates of the GPS antenna R and, knowing the distance between the antenna and the sensors, calculates the coordinates (Xn, Yn, Zn) of the centre of focus C of the sensor.
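In the simplest case this offset correction reduces to a vector sum, as sketched below (an assumption of ours: the antenna-to-sensor offset is taken as already expressed in the ground coordinate frame, whereas a full implementation would also rotate it by the vehicle heading):

```python
def sensor_focus_coordinate(gps_xyz, offset_xyz):
    """Coordinates (Xn, Yn, Zn) of the sensor's centre of focus C,
    obtained by adding the known, fixed antenna-to-sensor offset to
    the GPS antenna fix."""
    return tuple(g + o for g, o in zip(gps_xyz, offset_xyz))
```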
The control unit P then processes the image In, sub-dividing it into a number I of sectors Sn-i, where i is the index of the i-th sector of an area An. According to the invention, the number I of sectors is preferably equal to 10.
Fig. 3 schematically represents sub-division of the images into 10 sectors Sn-i.
In a preferred aspect of the invention, the control unit P is configured to sub-divide the area An so that all the sectors Sn-i have an area ASn-i corresponding to an area of equal surface on the ground.
In practice, the sub-division into the sectors Sn-i is performed so as to overcome image distortion caused by the perspective of the image.
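One way to obtain sectors of equal ground surface despite the perspective distortion (our own construction; the patent does not specify how the sub-division is computed) is to cut the trapezoidal footprint into strips whose boundaries solve the cumulative-area equation:

```python
import math

def equal_area_breaks(b, B, depth, n_sectors):
    """Boundary positions that split a trapezoidal footprint into
    strips of equal ground area.

    The footprint width grows linearly from b at the near edge to B at
    the far edge over the given depth. Each boundary y solves
    b*y + slope*y**2/2 = k * total_area / n_sectors.
    """
    total = (b + B) / 2.0 * depth
    slope = (B - b) / depth
    breaks = [0.0]
    for k in range(1, n_sectors + 1):
        target = total * k / n_sectors
        if slope == 0:
            y = target / b  # rectangular footprint: uniform spacing
        else:
            # Positive root of the quadratic cumulative-area equation.
            y = (-b + math.sqrt(b * b + 2.0 * slope * target)) / slope
        breaks.append(y)
    return breaks
```

With b = 2 m, B = 10 m and I = 10 sectors, the strips come out deeper near the vehicle and progressively shallower towards the far edge, compensating for the growing swath width.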
At this point, the control unit P, starting from the coordinates (Xn, Yn, Zn) of the sensors T1, T2, calculates the coordinates (XBn-i, YBn-i) of an identifying point BSn-i of each sector Sn-i, at ground level.
Preferably, the identifying point BSn-i corresponds to the centre of gravity of the corresponding area on the ground of each sector Sn-i (Fig.3).
The control unit P therefore calculates for each sector Sn-i a vegetation cover index Cn-i based on the number of pixels that were marked with the Vegetation value with respect to the total number of pixels in the sector Sn-i.
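The cover index of a sector is then a simple ratio, sketched here on a 2-D grid of Vegetation/Other labels:

```python
def cover_index(labels):
    """Vegetation cover index Cn-i of one sector: the number of pixels
    marked Vegetation (truthy) divided by the total number of pixels
    in the sector."""
    flat = [bool(p) for row in labels for p in row]
    return sum(flat) / len(flat)
```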
To reduce errors caused by variations in lighting conditions, framing angles and other factors, the control unit P compares the cover index Cn-i of an i-th sector with other cover indices obtained from previous image samplings.
According to the invention, the steps described above are in fact repeated a number N of times, where N is equal to or greater than 2, while the sensors T1, T2 are moved on the ground, for example during movement of the vehicle V.
According to the representation in the figure, the sensors are moved in the direction X. A characteristic of the invention is therefore that of comparing the vegetation cover values Cn-i of a same area on the ground, obtained by images taken in different instants in different spatial positions (Xn, Yn, Zn) and therefore with different angles of the sensor with respect to a given area of land.
More in detail, the control unit P compares the coordinates (XBn-i, YBn-i) of the identifying points BSn-i of several sectors Sn-i and determines a group of comparable sectors when the distance between the respective coordinates is below a maximum distance DSi.
In practice, the control unit establishes when the sectors Sn-i are sufficiently overlapped to compare the related cover indices Cn-i.
According to the invention, the maximum distance DSi between the respective coordinates of the identifying points BSn-i of the group of sectors Sk-i can be calculated from the geometric positioning parameters of the system on the vehicle, and therefore from the average distance of each identifying point from those closest to it within the analysis of a single frame.
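The grouping of overlapping sectors can be sketched as a greedy clustering of the identifying points under the threshold DSi (the patent does not prescribe a particular grouping algorithm; this single-pass scheme is an assumption of ours):

```python
import math

def group_sectors(points, max_dist):
    """Group identifying points (x, y) whose distance from a group's
    first point is below max_dist (the maximum distance DSi).

    Greedy single pass: each point joins the first sufficiently close
    group, otherwise it starts a new group.
    """
    groups = []
    for p in points:
        for g in groups:
            gx, gy = g[0]
            if math.hypot(p[0] - gx, p[1] - gy) < max_dist:
                g.append(p)
                break
        else:
            groups.append([p])
    return groups
```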
Fig. 4 schematically shows a sequence of areas An taken in several samplings in which the identifying points BSn-i are more or less close to one another.
The control unit P, performing statistical analysis of the vegetation cover index values Cn-i calculated for the single sectors belonging to the aforesaid group, attributes the resulting estimate of the vegetation cover index to the area on the ground deriving from the envelope of the areas previously calculated for each of the sectors belonging to the single group of sectors to which the estimate refers.
As can be seen in Fig. 4, the number of sectors Sn-i increases as they move away from the axis of movement of the vehicle. This is because, as the measurements of sectors Sn-i closer to the sensor (i.e. closer to the axis of the vehicle) are more nadiral, they are typically affected less by the aforesaid disturbances.
As they move away from the sensor in the direction Y, the lower precision of the measurements is compensated by the possibility of comparing several index measurements Cn-i of the related group.
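The statistical analysis over a group of comparable cover indices is not specified further in the text; as one robust choice, the median discards the occasional sample spoiled by lighting or viewing angle:

```python
from statistics import median

def estimate_group_cover(cover_indices):
    """Single cover estimate for a group of overlapping sectors: the
    median of their Cn-i values, insensitive to a few outlying
    measurements (one possible statistic; a mean or trimmed mean
    would also fit the wording of the method)."""
    return median(cover_indices)
```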
With the method of the present invention, performed by the device described above, it is possible to obtain an estimate of the vegetation cover index of an area of land, appropriately corrected to cancel, or at least minimise, errors caused by the angles at which the images of the plants are taken and by different lighting conditions or the other disturbance phenomena mentioned above.
Once the estimate of the vegetation cover value is known, it can be used, by means of further algorithms, to calculate the substances (such as fertilisers, plant protection products or water) and the quantities of these substances required by the plants in a given area of land.
According to the invention, these calculations can be performed by the control unit P in real time, so as to be able to directly control distribution means of these substances mounted on the vehicle.
According to a preferred variant of the invention, schematised in Fig. 5, the control unit P comprises a local unit L connected directly to each sensor and a central unit H.
According to the invention, the local unit L is capable of performing the following steps:
- acquisition by means of a sensor T1, T2 of an image In corresponding to an area An of land to be analysed;
- sub-division of the image In into a number I of sectors Sn-i;
- for each sector Sn-i, attribution of the "Vegetation" value or "Other" value to each pixel;
- for each sector Sn-i, determination of a vegetation cover index Cn-i based on the relation between the number of pixels classed as Vegetation and the total number of pixels contained in the sector Sn-i;
- repeating the operations above a number N of times;
- sending the vegetation cover values Cn-i to the central unit H.
The central unit H is connected to the GPS antenna.
The central unit H can then perform the subsequent steps of the method:
- determination of the coordinates XBn-i, YBn-i of an identifying point BSn-i, at a height equal to the ground, of each sector Sn-i, depending on the coordinates Xn, Yn, Zn of the sensor T1, T2;
- comparison of the coordinates XBn-i, YBn-i of the identifying points BSn-i of several sectors Sn-i and determination of a group of sectors Sk-i in which the distances between the respective coordinates are smaller than a maximum distance DSi;
- determination and attribution of a vegetation cover index Ci for an area of land deriving from the envelope of the areas corresponding to the sectors Sk-i of the aforementioned group, based on the cover values Cn-i calculated for each sector.
Preferably, the central unit H is connected to storage media ("Memory") to store all the information and the values calculated and, preferably, it is provided with Bluetooth® wireless communication means and with a Wi-Fi Internet connection to communicate with other devices, such as the means for distribution of the substances or other remote electronic appliances.
Advantageously, all the information and the parameters can be managed by an operator interface, such as a touch screen or the display of a Smartphone or of a tablet, in the case in which the control unit P, or at least the central unit H, makes use of the CPU of these devices.
According to a variant of the invention, the device is also provided with further sensors whose parameters are used to perform further corrections in the calculation of the quantity of substances to supply to a given vegetation area.
These sensors can comprise, for example, an ambient temperature sensor, a soil (or plant) temperature sensor, a humidity sensor or an ultrasound sensor.
Variants of the invention can provide for the use of one or more of the sensors listed above.
The invention has been described for illustrative and non-limiting purposes according to some preferred embodiments thereof. Those skilled in the art may find numerous other embodiments and variants, all falling within the scope of protection of the claims below.
Claims
1. A method for measuring vegetation cover on farmland comprising the following steps: a) acquisition by means of at least one image sensor (T1) having a centre of focus (C) with coordinates (Xn, Yn, Zn), of at least one image (In) of an area (An) of land;
b) attribution of a Vegetation value (V) or Other value (O) to each pixel in the image
(In), depending on the chromatic components of the image;
c) determination by means of a GPS device of the coordinates (Xn, Yn, Zn) of the centre of focus of the image sensor (T1, T2);
d) sub-division of the image (In) into a number (I) of sectors (Sn-i) each corresponding to a given area (ASn-i) on the land;
e) determination of the coordinates (XBn-i, YBn-i) of an identifying point (BSn-i), on the ground, of each sector (Sn-i), depending on the coordinates (Xn, Yn, Zn) of the sensor (T1, T2);
f) determination for each sector (Sn-i) of a vegetation cover index (Cn-i) based on the relation between the number of pixels classed as Vegetation (V) and the total number of pixels contained in the sector (Sn-i);
g) repeating steps a) to f) to acquire a number (N) of images (In), each image being acquired with the centre of focus (C) of the sensor (T1, T2) whose coordinates (Xn, Yn, Zn) do not coincide with those of the previous points;
h) comparison of the coordinates (XBn-i, YBn-i) of the identifying points (BSn-i) of several sectors (Sn-i) and determination of a group of sectors (Sk-i) in which the distances between the respective coordinates are smaller than a maximum distance (DSi);
i) determination and attribution, based on the cover values (Cn-i) calculated for each sector (Sk-i) in the group, of a vegetation cover index (Ci) for an area of land deriving from the overlapping of the areas corresponding to the sectors (Sk-i) of the aforementioned group.
2. Method according to claim 1, wherein said at least one sensor (T1, T2) is moved in a straight line direction (X) at a speed of between 3 km/h and 15 km/h, and the images (In) are acquired at a frequency of between 1 Hz and 5 Hz.
3. Method according to claim 1 or 2, wherein the optical axis (O) of the sensor is inclined at an angle (α) of around 45° with respect to a horizontal plane, and at an angle (β) of around 90° with respect to a vertical plane parallel to the direction (X) of movement of the sensor (T1, T2).
4. Method according to any of the previous claims, wherein the area (An) has a trapezoidal shape with the larger base (B) further from the trajectory (X) of movement of the sensor (T1, T2) with respect to the smaller base (b).
5. Method according to any of the previous claims, wherein the number of sectors (Sn-i) of each image (In) is equal to 10.
6. Method according to any of the previous claims, wherein the identifying point (BSn-i) is found at the centre of gravity of the corresponding area on the ground.
7. Method according to any of the previous claims, wherein the maximum distance (DSi) between the respective coordinates of the identifying points (BSn-i) of the group of sectors (Sk-i) is less than a pre-set value calculated based on the geometric positioning parameters of the sensor with respect to the ground.
8. Device for measuring the vegetation cover of an area of farmland, comprising:
- at least one image sensor (T1, T2);
- a control unit (P) connected to said sensor (T1, T2);
- a GPS device (R) connected to the control unit (P);
in which the control unit is configured to perform the steps according to any of claims 1 to 7.
9. A device according to claim 8, in which the control unit comprises a local unit (L) and
a central unit (H), where the local unit (L) is configured to perform the steps of the method a), b), d), f), g) and to send the vegetation cover values (Cn-i) for the sectors (Sn-i) to the central unit (H), while the central unit (H) is configured to perform the remaining steps c), e), h), i).
10. Device according to claim 8 or 9, further comprising one or more of the following sensors:
- humidity;
- ambient temperature;
- soil temperature;
- ultrasound.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
ITPC20140014 | 2014-06-17 | ||
ITPC2014A000014 | 2014-06-17 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015193822A1 (en) | 2015-12-23 |
Family
ID=51454831
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IB2015/054561 WO2015193822A1 (en) | 2014-06-17 | 2015-06-17 | Method and device for measuring vegetation cover on farmland |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2015193822A1 (en) |
Patent Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5296702A (en) | 1992-07-28 | 1994-03-22 | Patchen California | Structure and method for differentiating one object from another object |
US5389781A (en) | 1992-07-28 | 1995-02-14 | Patchen California | Structure and method usable for differentiating a plant from soil in a field |
US5585626A (en) | 1992-07-28 | 1996-12-17 | Patchen, Inc. | Apparatus and method for determining a distance to an object in a field for the controlled release of chemicals on plants, weeds, trees or soil and/or guidance of farm vehicles |
US8712144B2 (en) * | 2003-04-30 | 2014-04-29 | Deere & Company | System and method for detecting crop rows in an agricultural field |
US20050098713A1 (en) | 2003-09-23 | 2005-05-12 | Kyle Holland | Light sensor with modulated radiant polychromatic source |
US20120195496A1 (en) * | 2011-01-31 | 2012-08-02 | Zaman Qamar-Uz | Variable rate sprayer system and method of variably applying agrochemicals |
WO2013087052A1 (en) | 2011-12-13 | 2013-06-20 | Yara International Asa | Method and apparatus for contactlessly determining plant parameters and for processing this information |
US20140021267A1 (en) * | 2012-07-23 | 2014-01-23 | Vision Robotics Corporation | System and method for crop thinning with fertilizer |
EP2728308A2 (en) * | 2012-10-31 | 2014-05-07 | Kabushiki Kaisha Topcon | Aerial photogrammetry and aerial photogrammetric system |
Non-Patent Citations (1)
Title |
---|
KNORN J ET AL: "Land cover mapping of large areas using chain classification of neighboring Landsat satellite images", REMOTE SENSING OF ENVIRONMENT, ELSEVIER, XX, vol. 113, no. 5, 15 May 2009 (2009-05-15), pages 957 - 964, XP026051923, ISSN: 0034-4257, [retrieved on 20090223], DOI: 10.1016/J.RSE.2009.01.010 * |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11723298B2 (en) | 2016-09-16 | 2023-08-15 | Basf Agro Trademarks Gmbh | Efficient use of plant protection agents, nutrients, and the like in the growing of cultivated plants |
CN109788750A (en) * | 2016-09-16 | 2019-05-21 | 巴斯夫农化商标有限公司 | The determination of demand to plant protection product |
WO2018050580A1 (en) * | 2016-09-16 | 2018-03-22 | Bayer Cropscience Aktiengesellschaft | Determination of the requirements on plant protection agents |
US10893669B2 (en) | 2016-09-16 | 2021-01-19 | Basf Agro Trademarks Gmbh | Determination of the requirements on plant protection agents |
US11825835B2 (en) | 2016-09-16 | 2023-11-28 | Basf Agro Trademarks Gmbh | Determination of the requirements of plant protection agents |
WO2018073060A1 (en) | 2016-10-18 | 2018-04-26 | Bayer Cropscience Aktiengesellschaft | Planning and implementing agricultural measures |
US11818975B2 (en) | 2016-10-18 | 2023-11-21 | Basf Agro Trademarks Gmbh | Planning and implementing agricultural measures |
CN111868566A (en) * | 2019-10-11 | 2020-10-30 | 安徽中科智能感知产业技术研究院有限责任公司 | Agricultural machine working area measuring and calculating method based on positioning drift measuring and calculating model |
CN111868566B (en) * | 2019-10-11 | 2023-10-03 | 安徽中科智能感知科技股份有限公司 | Agricultural machinery operation area measuring and calculating method based on positioning drift measuring and calculating model |
WO2021068177A1 (en) * | 2019-10-11 | 2021-04-15 | 安徽中科智能感知产业技术研究院有限责任公司 | Agricultural machinery operation area calculation method based on positioning drift calculation model |
CN114543638A (en) * | 2022-01-12 | 2022-05-27 | 四川恒得复生态科技有限公司 | Tool capable of rapidly measuring herbaceous coverage |
CN117131441A (en) * | 2023-10-25 | 2023-11-28 | 北京大学深圳研究生院 | Night light pollution monitoring method, device, computer equipment and storage medium |
CN117131441B (en) * | 2023-10-25 | 2024-02-13 | 北京大学深圳研究生院 | Night light pollution monitoring method, device, computer equipment and storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11744189B2 (en) | Plant treatment based on morphological and physiological measurements | |
US11771077B2 (en) | Identifying and avoiding obstructions using depth information in a single image | |
WO2015193822A1 (en) | Method and device for measuring vegetation cover on farmland | |
US9983311B2 (en) | Modular systems and methods for determining crop yields with high resolution geo-referenced sensors | |
US20190162855A1 (en) | Systems and methods for determining crop yields with high resolution geo-referenced sensors | |
US20220330499A1 (en) | System and method for turning irrigation pivots into a soil and plant radar | |
JPWO2019044244A1 (en) | Crop cultivation support device | |
US20220101554A1 (en) | Extracting Feature Values from Point Clouds to Generate Plant Treatments | |
Kavvadias et al. | Precision Agriculture-Comparison and Evaluation of Innovative Very High Resolution (UAV) and LandSat Data. | |
KR102479284B1 (en) | Vegetation index acquisition unit and apparatus for monitoring plant comprising the same | |
Tsoulias et al. | An approach for monitoring temperature on fruit surface by means of thermal point cloud | |
US20220100996A1 (en) | Ground Plane Compensation in Identifying and Treating Plants | |
CN105181632B (en) | NDVI measuring device is imaged in network-type various dimensions plant | |
Zarco-Tejada et al. | New tools and methods in agronomy | |
CN112233121A (en) | Fruit yield estimation method based on binocular space positioning and intelligent segmentation | |
Na et al. | Monitoring onion growth using UAV NDVI and meteorological factors | |
Feng | Quantifying the effect of environments on crop emergence, development and yield using sensing and deep learning techniques | |
Hu et al. | Identifying rice seedling bands based on slope virtualization clustering | |
Dunn et al. | Vision based macadamia yield assessment | |
WO2024069631A1 (en) | Plant phenotyping | |
Hu PengCheng et al. | Estimation of canopy height using an Unmanned Aerial Vehicle in the field during wheat growth season. |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15742071 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 15742071 Country of ref document: EP Kind code of ref document: A1 |