Supervised Classification of RGB Aerial Imagery to Evaluate the Impact of a Root Rot Disease

Abstract: Aerial imaging provides a landscape view of crop fields that can be utilized to monitor plant diseases. Phymatotrichopsis root rot (PRR) is a serious root rot disease affecting several dicotyledonous hosts, including the perennial forage crop alfalfa. PRR causes stand loss by spreading as circular to irregular diseased areas that expand over time, but disease progression in alfalfa fields is poorly understood. The objectives of this study were to develop a workflow to produce PRR disease maps from sets of high-resolution red, green and blue (RGB) images acquired from two different platforms and to assess the feasibility of using these PRR disease maps to monitor disease progression in alfalfa fields. Six aerial RGB images, two acquired from an unmanned aircraft system (UAS) and four from a manned aircraft platform, were obtained at different time points during the 2014-2015 growing seasons from a center-pivot irrigated, PRR-infested alfalfa field near Burneyville, OK. Supervised classification of the images from both platforms was performed using three spectral signatures: image-specific, UAS-platform-specific and manned-aircraft-platform-specific. Our results showed that the UAS-platform-specific spectral signature was most efficient for classifying images acquired with the UAS, with accuracy ranging from 90 to 96%. In contrast, manned-aircraft-acquired images classified using image-specific spectral signatures yielded 95 to 100% accuracy. The effect of hue, saturation and value color space transformations (HSV and Hrot60SV) on classification accuracy was also assessed, but neither transformation improved classification efficiency compared to the RGB color space. Finally, the data showed that the area classified as bare ground increased by 74% during the study period, indicating the extent of alfalfa stand loss caused by PRR. Thus, this study demonstrates the utility of high-resolution RGB aerial images for monitoring PRR disease spread in alfalfa.


Introduction
The bird's-eye view that aerial imagery provides of target areas offers an objective and cost-effective method to monitor large areas compared to ground-based scouting [1]. With recent technological advances, high-resolution aerial images can be acquired by coupling different kinds of sensors [2,3] to a variety of platforms, including satellites, manned aircraft and unmanned aircraft systems. Several studies have utilized aerial images to monitor plant health in agricultural and forest lands [1,4,5]. While aerial images differ with sensor type and resolution, all are derived from light energy at particular wavelengths recorded by the sensor aboard the platform.

In the current study, high-resolution red, green and blue (RGB) aerial images of a PRR-infested alfalfa field were obtained at different times during two crop growing seasons using either manned or unmanned aircraft platforms with the following objectives: (1) develop a workflow to produce PRR disease maps from sets of high-resolution RGB images acquired from two different platforms; and (2) assess the feasibility of using these PRR disease maps to monitor disease progression in alfalfa fields.

Study Site
The study was conducted on a PRR-infested 24.8 ha semi-circular commercial alfalfa hay production field under a center-pivot irrigation system located at the Noble Research Institute's Red River Farm, Burneyville, Oklahoma (Figure 1; 33°52′35″N, 97°15′28″W, 210 m elevation). The study site was previously a pecan orchard that was converted in 2005 into a production field. Soybeans, rye, wheat, triticale and oats were grown on this site before it was planted with America's Alfalfa Alfagraze 600 RR in the autumn of 2011. During May and June of 2015, this location received 856 mm of rainfall (nearly 90% of the average annual precipitation) and the Red River escaped its banks, flooding portions of the study site and resulting in the loss of 3.6 ha of alfalfa from the field.

Data Collection and Processing
Two different imaging platforms (Supplementary Figure S1) were utilized to acquire a total of six aerial images of the study site at different solar times during the growing seasons of 2014 and 2015 (Table 1). A Vireo fixed-wing unmanned aircraft system (UAS) with a 10 megapixel (MP) RGB camera was flown in June and August 2014 by the Farm Intelligence Company (USA). The flights occurred at an altitude of 120 m above ground level (AGL) between 11:00 and 15:00 (local time) and the conditions were sunny to mostly sunny.

In October 2014 and during the 2015 growing season, aerial imagery was obtained by CloudStreet AirBorne Survey (USA) using a 22 MP Canon EOS 5D Mark III mounted on a piloted Dragonfly sport utility aircraft flown at 300 m AGL. An aerial survey system (Track'Air, Hengelo, The Netherlands) was used to automatically trigger the camera as the pilot maneuvered the aircraft over each point in the flight plan grid. A laser range finder used as an altimeter recorded the aircraft's altitude AGL at high frequency, and a geographic positioning system (GPS) receiver recorded the coordinates of the aircraft. Time stamps logged by the altimeter and GPS were matched with the trigger time logged by the camera to determine the altitude and location of the aircraft at the time each image was captured [24–26].
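The time-stamp matching step amounts to a nearest-neighbour lookup between the camera trigger log and the high-frequency altimeter/GPS logs. The Python sketch below is only a conceptual illustration; the survey system's actual log formats and matching tolerances are not described in the source, and the times and altitudes here are hypothetical.

```python
from bisect import bisect_left

def nearest_log_entry(log_times, log_values, trigger_time):
    """Return the logged value whose time stamp is closest to trigger_time.

    log_times must be sorted ascending; a binary search keeps the lookup
    fast even for a high-frequency altimeter or GPS log.
    """
    i = bisect_left(log_times, trigger_time)
    if i == 0:
        return log_values[0]
    if i == len(log_times):
        return log_values[-1]
    before, after = log_times[i - 1], log_times[i]
    # pick whichever neighbouring log entry is closer in time
    if after - trigger_time < trigger_time - before:
        return log_values[i]
    return log_values[i - 1]

# Hypothetical altimeter log (seconds, metres AGL) and a camera trigger time.
times = [0.0, 0.5, 1.0, 1.5, 2.0]
alts = [298.0, 299.5, 300.2, 300.8, 301.0]
altitude_at_trigger = nearest_log_entry(times, alts, 1.2)
```

The same lookup, applied to the GPS log, yields the aircraft coordinates at the moment of exposure.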
Images acquired from the manned aircraft platform were converted from the RAW file format to the TIFF file format; the UAS images were collected in JPEG format and were not converted. Irrespective of the image source, the images were loaded into the Agisoft PhotoScan software and underwent a series of workflow steps that included image alignment, building a dense point cloud, developing a mesh and creating an orthomosaic, as described in the software's user manual. The final orthomosaic images were imported into ArcMap 10.3.1 for additional computation. A minimum of 24 white reflective 0.09 m² square metal plates were fixed on the ground at known locations around and within the field. These plates could be manually identified in the aerial images and served as ground control points for image registration. To maintain map projection and accuracy, the images were geo-rectified with a spline transformation and projected to the Universal Transverse Mercator (UTM), World Geodetic Survey 1984 (WGS-84), Zone 14 North coordinate system. The polygon selection tool was then used to delineate the flooded section and field boundary in all six aerial images.
The RGB images collected using the manned and unmanned aircraft had a ground sample distance that ranged from 0.018 m to 0.064 m (Table 1). To standardize the analysis of images across different time points, all the images were resampled to a coarser resolution of 0.10 m. Image resampling was performed with the 'resample' tool using the nearest neighbor assignment resampling technique.
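The study used ArcMap's 'resample' tool for this step; as a rough conceptual stand-in, the numpy sketch below downsamples a single band by assigning to each coarse output cell the value of the input pixel nearest its centre, which is the essence of nearest neighbor assignment. The band values and resolutions are toy assumptions.

```python
import numpy as np

def resample_nearest(band, src_res, dst_res):
    """Resample a single-band raster to a coarser resolution using
    nearest-neighbour assignment: each output cell takes the value of
    the input cell whose centre is nearest the output cell centre."""
    scale = dst_res / src_res                     # input pixels per output pixel
    rows = int(band.shape[0] * src_res / dst_res)
    cols = int(band.shape[1] * src_res / dst_res)
    # centre of each output cell, expressed in input-pixel coordinates
    r_idx = np.minimum(((np.arange(rows) + 0.5) * scale).astype(int),
                       band.shape[0] - 1)
    c_idx = np.minimum(((np.arange(cols) + 0.5) * scale).astype(int),
                       band.shape[1] - 1)
    return band[np.ix_(r_idx, c_idx)]

band = np.arange(100).reshape(10, 10)        # toy 10x10 band at 0.02 m
coarse = resample_nearest(band, 0.02, 0.10)  # 2x2 output at 0.10 m
```

Nearest neighbor assignment preserves the original pixel values (no averaging), which matters when the bands will later feed a per-pixel classifier.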

Image Classification
PRR has a distinctive disease pattern that is discernible from the damage caused by other pests and abiotic factors [9,16]. Multispectral or hyperspectral images make it possible to detect the condition of diseased plants before symptoms are visible to the human eye and can also identify plant stress or the severity of damage. Our datasets, however, include only the visible red, green and blue bands, which cannot differentiate those nuanced differences; our objective was to monitor PRR disease spread rather than to detect disease. Ground truthing revealed PRR as the dominant factor contributing to alfalfa stand loss at the study site during the 2014-2015 period. The study site was therefore categorized into two classes: alfalfa and bare ground (treated as a 'soil' class while performing supervised image classification). However, during the course of the study, weeds emerged in some of the bare portions of the diseased areas. Hence, weed was also included as a third class in the image classification process (Figure 2). The non-availability of ground reference data limited our ability to generate the training and validation datasets required to perform supervised classification.
As an alternative approach, a researcher with field experience of PRR disease at this site visually inspected the georectified, resampled images to generate training and validation data. For each image, one hundred polygons encompassing all three classes (alfalfa (40), soil (40) and weed (20)) and spread uniformly throughout the study area were selected (Figure 3). Of these, 60 (alfalfa (25), soil (25) and weed (10)) were randomly selected as training datasets and the remaining 40 (alfalfa (15), soil (15) and weed (10)) served for validation purposes. This ensured that the training and validation datasets were independent of each other. The total number of pixels utilized to generate the training and validation datasets is enumerated in Table 2. Each image (except the August 2014 and September 2015 images) was classified using one of three spectral signatures derived from the training samples: (a) image-specific; (b) UAS-platform-specific; and (c) manned-aircraft-platform-specific. The UAS-platform-specific and manned-aircraft-platform-specific spectral signatures were developed from the August 2014 and September 2015 RGB images, respectively. Therefore, for the August 2014 image, the image-specific and UAS-platform-specific spectral signatures were the same; similarly, for the September 2015 image, the image-specific and manned-aircraft-platform-specific spectral signatures were identical. These particular RGB images were chosen to develop the platform-specific signatures because they were taken during the mid-growing season, a period when PRR disease symptoms are pronounced in the field.

A color model conversion function in the ArcMap software was employed to convert the aerial images from RGB to hue, saturation and value (HSV) color space. Unlike RGB, the HSV color space channels are less correlated with each other.
This conversion sets the hue, saturation and value channel values between 0 and 240, 0 and 255, and 0 and 255, respectively. Analysis of the spectral signatures generated with the HSV images revealed that the hue channel provided greater contrast between the three classes in the study, but with higher variation than the saturation and value channels. Since hue is expressed as a polar dimension, with red hues mapped to values near both 240 and 0, we rotated the hue values by 60 units to produce a new Hrot60SV image by adding 60 units to hue values less than 180 and subtracting 180 from values between 180 and 240 (e.g., 0 and 240 become 60, and 180 becomes 0). This conversion was performed in the R software version 3.4.2 using the raster package. Thus, for each aerial image we had three variants to evaluate: RGB, HSV and Hrot60SV (Figure 4). Based on the spectral signatures unique to each variant image, maximum likelihood supervised classification was performed, categorizing each image into three classes (alfalfa, soil and weed). The maximum likelihood classification algorithm assigns a pixel to a user-defined class based on Bayes' decision theory.
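The hue rotation and the maximum likelihood rule can be sketched in Python as follows. The study performed these steps with R's raster package and ArcMap; the Gaussian-signature classifier below and its toy training values are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def rotate_hue_60(h):
    """Rotate an ArcMap-style hue channel (0-240) by 60 units: values
    below 180 shift up by 60; values from 180 to 240 shift down by 180.
    This moves reddish hues away from the 0/240 wrap-around
    (0 and 240 both become 60; 180 becomes 0)."""
    h = np.asarray(h, dtype=float)
    return np.where(h < 180, h + 60, h - 180)

def fit_gaussian_classes(pixels, labels):
    """Per-class mean vector and covariance matrix estimated from
    training pixels -- a simplified stand-in for the spectral
    signatures used by maximum likelihood classification."""
    return {c: (pixels[labels == c].mean(axis=0),
                np.cov(pixels[labels == c], rowvar=False))
            for c in np.unique(labels)}

def ml_classify(pixels, signatures):
    """Assign each pixel (row of band values) to the class with the
    highest Gaussian log-likelihood, assuming equal priors."""
    classes = list(signatures)
    scores = []
    for c in classes:
        mu, cov = signatures[c]
        inv = np.linalg.inv(cov)
        logdet = np.linalg.slogdet(cov)[1]
        d = pixels - mu
        maha = np.einsum('ij,jk,ik->i', d, inv, d)  # squared Mahalanobis distance
        scores.append(-0.5 * (maha + logdet))
    return np.array(classes)[np.argmax(scores, axis=0)]

# Toy training pixels (hypothetical RGB values) for two of the classes.
train = np.array([[30, 120, 40], [32, 119, 42], [28, 122, 39],
                  [31, 118, 43], [29, 121, 41],
                  [150, 110, 90], [152, 109, 92], [148, 112, 89],
                  [151, 108, 93], [149, 111, 91]], dtype=float)
labels = np.array(["alfalfa"] * 5 + ["soil"] * 5)
pred = ml_classify(np.array([[30.0, 120.0, 41.0], [150.0, 110.0, 90.0]]),
                   fit_gaussian_classes(train, labels))
```

With equal priors, maximizing the Gaussian log-likelihood is the standard maximum likelihood decision rule; ArcMap additionally supports per-class prior probabilities and reject thresholds not shown here.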

Model Accuracy
An accuracy assessment of all the classified images was performed using the 40 validation polygons generated for each image, as mentioned above. One pixel was randomly selected from each validation polygon and compared with the corresponding pixel class in the classified image. This iterative process was performed 1000 times to compute the mean overall accuracy and balanced accuracy values for the alfalfa, soil and weed pixel classes. As the weed class was underrepresented compared to the alfalfa and soil classes, balanced accuracy values were estimated for each class, thus accounting for the imbalanced dataset. All analyses were performed in the R software, version 3.4.2, using the raster and caret packages [27,28].
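The iterative estimate can be sketched as follows. The study computed this (and the balanced accuracies) with the R raster and caret packages; this Python version, with hypothetical validation polygons, only illustrates the resampling scheme for overall accuracy.

```python
import random

def mean_overall_accuracy(polygons, n_iter=1000, seed=42):
    """polygons: list of (true_class, predicted_pixel_classes) pairs,
    one pair per validation polygon. Each iteration draws one random
    pixel per polygon and scores the fraction classified correctly;
    the mean over all iterations estimates the overall accuracy."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_iter):
        hits = sum(true == rng.choice(preds) for true, preds in polygons)
        total += hits / len(polygons)
    return total / n_iter

# Hypothetical validation polygons: two classified perfectly,
# one with half of its pixels misclassified as soil.
polys = [("alfalfa", ["alfalfa"] * 10),
         ("soil", ["soil"] * 10),
         ("weed", ["weed"] * 5 + ["soil"] * 5)]
acc = mean_overall_accuracy(polys)  # expected near (1 + 1 + 0.5) / 3
```

Sampling one pixel per polygon (rather than all pixels) keeps large polygons from dominating the estimate, at the cost of needing many iterations to stabilize the mean.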

Agreement between Two Classified Images
Comparison of the classified images generated from an RGB aerial image using two different spectral signatures was performed by pairing the classes of all pixels from both images, thereby creating nine class pairs (alfalfa-alfalfa, soil-soil, weed-weed, alfalfa-soil, alfalfa-weed, soil-alfalfa, soil-weed, weed-alfalfa and weed-soil). The consistent class pairs, where both classified pixels agreed (alfalfa-alfalfa, soil-soil, weed-weed), were not considered for further analysis. To estimate the true class of the pixels in the six inconsistent class pairs, 20 pixels were sampled from each class pair in a stratified random manner and manually classified by visually inspecting the corresponding pixels in the RGB image. All analyses were performed in the R software, version 3.4.2, using the raster package.
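The pairing step and the weighted congruency score used later in the Results can be sketched as follows. The class labels, weights and match percentages below are hypothetical placeholders, not values from the study, and the study's implementation in the R raster package is not reproduced here.

```python
from collections import Counter

def class_pairs(classified_a, classified_b):
    """Pair the class assigned to each pixel by two classifications
    and count the resulting class pairs (nine possible pairs with
    three classes, e.g. ('alfalfa', 'soil'))."""
    return Counter(zip(classified_a, classified_b))

def overall_congruency(weights, pct_match):
    """weights: weight factor of each inconsistent class pair (its
    share of the disagreeing pixels); pct_match: percentage of the
    manually inspected sample pixels in that pair whose true class
    matched this classification. Congruency is the weighted sum."""
    return sum(weights[p] * pct_match[p] for p in weights)

# Hypothetical per-pixel class assignments from two classifications.
a = ["alfalfa", "alfalfa", "soil", "weed", "soil"]
b = ["alfalfa", "soil", "soil", "alfalfa", "weed"]
pairs = class_pairs(a, b)  # e.g. pairs[("alfalfa", "soil")] counts disagreements
```

The consistent pairs (identical class on both sides) would simply be dropped from `pairs` before computing weights over the remaining, disagreeing pixels.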

Post-Processing of Classified Images
Visual observation of the classified images revealed many misclassified isolated pixels. To remove this noise, the images underwent a series of post-classification processing steps: filtering to remove isolated pixels from the classified images, smoothing class boundaries and reclassifying small isolated regions (pixel count less than 100) to the closest surrounding cell values. All these steps were accomplished using the generalization tools (majority filter, boundary clean, region group, set null and nibble) in ArcMap 10.3.1. After post-classification processing, the number of pixels belonging to each class was calculated for each image to assess the area of alfalfa stand loss caused by PRR disease.
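A simplified stand-in for the isolated-pixel filtering step is a 3×3 majority filter, sketched below in Python. ArcMap's Majority Filter and the other generalization tools apply more elaborate rules (connectivity tests, boundary cleaning, region-size thresholds); this sketch only illustrates the principle on a toy class raster.

```python
import numpy as np
from collections import Counter

def majority_filter(classified, passes=1):
    """Replace each pixel with the most common class in its 3x3
    neighbourhood, suppressing isolated misclassified pixels
    (a simplified stand-in for ArcMap's Majority Filter tool)."""
    out = classified.copy()
    rows, cols = out.shape
    for _ in range(passes):
        src = out.copy()  # read from a frozen copy so updates don't cascade
        for r in range(rows):
            for c in range(cols):
                window = src[max(r - 1, 0):r + 2, max(c - 1, 0):c + 2].ravel()
                out[r, c] = Counter(window.tolist()).most_common(1)[0][0]
    return out

# Toy classified raster (1 = alfalfa, 2 = soil) with one isolated soil pixel.
field = np.array([[1, 1, 1, 1],
                  [1, 2, 1, 1],
                  [1, 1, 1, 1],
                  [1, 1, 1, 1]])
smoothed = majority_filter(field)  # the lone soil pixel is absorbed
```

Reclassifying larger speckle regions (under 100 pixels, as in the study) would additionally require connected-component labelling before reassigning each small region to its surrounding class.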

Results
In total, two (June 2014 and August 2014) and four (October 2014, June 2015, September 2015 and October 2015) RGB datasets were collected using the UAS and manned aircraft, respectively, from the study area during the 2014-2015 growing seasons (Table 1). As the ground sample distance differed between the two platforms, all images were resampled to 10 cm resolution to ensure consistent comparison among the images. Visual inspection of the aerial images showed expanding circular to irregular PRR disease areas, with asymptomatic plants outside each circle, surviving plants within it and, in some instances, weeds occupying the bare ground inside the diseased areas.

UAS-Acquired Images
The accuracy assessment estimates for the supervised classification of the June 2014 and August 2014 RGB images using different spectral signatures are summarized in Table 3. The overall accuracy for the June 2014 classified image acquired using the UAS platform ranged from 0.508 with the manned-aircraft-platform-specific spectral signature to 0.968 with the image-specific spectral signature. It is interesting to note that the UAS platform-specific spectral signature (developed from the August 2014 image), when applied to the June 2014 image, yielded an accuracy of 0.901, comparable to the accuracy obtained with the image-specific spectral signature. Likewise, the overall accuracies for the August 2014 RGB image classified with the UAS platform-specific spectral signature (which in this case is also image-specific) and the manned-aircraft-platform-specific spectral signature were 0.896 and 0.584, respectively. Balanced accuracy assessments for the soil class showed accuracy values higher than 0.86 regardless of the spectral signature for both the June and August 2014 images (Table 3). However, classification with the manned-aircraft-platform-specific spectral signature resulted in only a small number of pixels being assigned to the alfalfa class, yielding lower accuracy values of 0.50 for the alfalfa class in both UAS-acquired images. Similar trends were observed with the HSV and Hrot60SV variants of the June and August 2014 images (Supplementary Tables S1 and S2). The data clearly indicated that the manned-aircraft-platform-specific spectral signature cannot be employed to classify images acquired using the UAS platform. We further analyzed the congruency between the June 2014 RGB image classified using the image-specific spectral signature and that classified using the UAS platform-specific spectral signature (overall mean accuracies of 0.968 and 0.901, respectively).
For this purpose, we compared the class assigned to each pixel in both classified images, which resulted in nine class pairs, as described in Table 4. About 76% of the pixels were assigned the same class (alfalfa, soil or weed) in both classified images; the remaining 24% of pixels did not match between the classified images. A weight factor was calculated for each inconsistent class pair based on its percentage of pixels (Tables 4 and 5). We then randomly selected 20 pixels from the set of pixels in each inconsistent class pair (alfalfa-soil, alfalfa-weed, soil-alfalfa, soil-weed, weed-alfalfa, weed-soil) and manually classified each pixel by visually evaluating the corresponding pixel in the June 2014 RGB image. The results are presented as the percentage of pixels within each pair classified as alfalfa, soil or weed (Table 5). Overall congruency was determined by multiplying the percentage matching the manual classification by the weight factor and summing over the classes. The results in Table 5 indicate that the image classified using the UAS platform-specific spectral signature had more class pairs (alfalfa-soil, soil-alfalfa, weed-alfalfa, weed-soil) closely representing the manually classified RGB image than the image classified using the image-specific spectral signature (alfalfa-weed and soil-weed).

Table 4. Agreement between the June 2014 RGB (red, green, blue) image classified using the image-specific spectral signature and the June 2014 RGB image classified using the UAS platform-specific spectral signature. Y The percentage of pixels was calculated by randomly selecting 20 pixels from the set of pixels in each class pair, which were manually classified by visually evaluating the corresponding pixel in the June 2014 RGB image. Z The overall congruency was determined by multiplying the percentage matching the manual classification by the weight factor and summing over the classes.

Further, the sums across classes of the correct weight-factor-adjusted percentages were 47.70 and 41.10, respectively. These data suggest that although the overall accuracy for the image classified using the image-specific spectral signature (0.968, Table 3) was higher than that of the image classified using the UAS platform-specific spectral signature (0.901, Table 3), the pixels from the latter classified image more closely resembled the manually classified pixels where the supervised classifications disagreed. Therefore, the best spectral signature for classifying the June 2014 RGB image was the UAS platform-specific signature.

Manned-Aircraft-Acquired Images
Accuracy assessment estimates for the supervised classification of the October 2014, June 2015, September 2015 and October 2015 RGB images acquired by manned aircraft using different spectral signatures are presented in Table 3. The overall accuracy for these four images classified using image-specific spectral signatures ranged from 0.915 for the September 2015 image to 0.981 for the June 2015 image. The spectral signature developed for the UAS platform could not adequately classify any of the images taken by manned aircraft, as evidenced by accuracy estimates ranging from 0.405 to 0.56. With regard to classification based on the manned-aircraft-platform-specific spectral signature, the overall mean accuracy estimates ranged from 0.687 to 0.903, indicating that the utility of the manned-aircraft-specific spectral signature is image specific.
Balanced accuracy assessments for the soil class in all four images classified using the image-specific and manned-aircraft-platform-specific spectral signatures showed accuracy values higher than 0.93 (Table 3). In contrast, balanced accuracy estimates for the alfalfa and weed classes were lower (ranging between 0.5 and 0.731) for all four images when classified using the UAS platform-specific spectral signature. Similar trends were also observed for the HSV and Hrot60SV variants of all four images acquired by manned aircraft (Supplementary Tables S1 and S2), indicating that the spectral signature developed from the UAS-acquired images cannot be employed to classify images acquired using manned aircraft. In addition, image-specific spectral signatures provided the greatest accuracy for the classification of images acquired by manned aircraft.

Effects of Post-Processing on Image Accuracy
Our data showed a minimal effect of the HSV and Hrot60SV transformations on classification accuracy. Therefore, we continued our analysis with the RGB images, reducing the time and resources spent on image transformation. The best spectral signatures for classifying the RGB aerial images were determined to be the UAS platform-specific spectral signature for images acquired by UAS and the image-specific spectral signature for manned-aircraft-acquired images. However, upon visual observation these classified images showed many misclassified isolated pixels, creating a speckled appearance. We therefore performed post-classification processing steps to diminish this effect and estimated the accuracy values of the resulting images, as outlined in Table 6. Apart from the June 2014 image, post-classification processing either maintained or improved the overall accuracy estimates for all the datasets; for the June 2014 image, the overall accuracy dropped slightly from 0.901 (Table 3) to 0.898 (Table 6). The post-processed June and September 2015 classified images achieved 100 percent accuracy.

Effect of PRR Disease on Alfalfa Stand
We further determined the effect of PRR disease on the alfalfa stand from the post-processed classified images. Based on the classified image datasets from Table 6, the area covered by the alfalfa, soil and weed classes was determined; the results are presented in Figure 5. The areas under the alfalfa, soil and weed classes in the June 2014 image were estimated to be 13.7, 5.8 and 1.7 ha, respectively. It should be noted, however, that the accuracies for the alfalfa, soil and weed classes were 89.7%, 99.8% and 78%, respectively (Table 6). Intuitively, as the season progresses one might expect a reduction in the alfalfa stand and a corresponding increase in bare ground due to PRR disease. By the end of the crop season in October 2015, the areas under alfalfa and soil had changed dramatically: PRR disease caused a 31.4% reduction in the alfalfa stand between June 2014 and October 2015, with the bare ground increasing by 74% during this period (Figure 5).
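The area bookkeeping behind these figures is straightforward: at 0.10 m resolution each pixel covers 0.01 m², so class areas follow directly from pixel counts. The short sketch below uses the reported June 2014 alfalfa and soil areas; the pixel counts and the end-of-study areas (implied by the reported percentage changes) are back-calculated assumptions, not values taken from the classified rasters.

```python
def class_areas(class_counts, pixel_size=0.10):
    """Convert per-class pixel counts from a classified image into
    hectares: each pixel covers pixel_size**2 square metres and
    1 ha = 10,000 m^2."""
    cell = pixel_size ** 2
    return {c: n * cell / 10_000 for c, n in class_counts.items()}

def percent_change(before_ha, after_ha):
    """Signed percentage change in a class area between two dates."""
    return (after_ha - before_ha) / before_ha * 100

# 13.7 ha of alfalfa at 0.10 m resolution corresponds to
# 13.7 * 10,000 / 0.01 = 13,700,000 pixels (soil: 5,800,000).
areas = class_areas({"alfalfa": 13_700_000, "soil": 5_800_000})
# End-of-study areas implied by the reported changes (assumed here):
alfalfa_change = percent_change(13.7, 9.4)   # about -31.4%
soil_change = percent_change(5.8, 10.1)      # about +74%
```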

Discussion
We applied an aerial imaging approach to better understand PRR disease spread and map PRR-infested areas in an alfalfa field. Earlier research on PRR disease has focused on multispectral or hyperspectral aerial images acquired using a manned aircraft platform; those studies were conducted in cotton, another important host for P. omnivora [15,17,29]. To our knowledge, this is the first study to use multiple high-resolution aerial RGB images of a PRR-infested alfalfa field spanning two growing seasons (2014-2015). Unlike cotton, alfalfa is a perennial forage crop that is cut several times within a growing season, providing a unique opportunity to study PRR disease progression under such intense management practices. In addition, continuous host availability over different years influences pathogen movement and survival, thereby affecting stand yields. We therefore used RGB images, as we were interested in monitoring PRR disease spread rather than PRR detection, which would have required multi- or hyperspectral sensors capable of capturing spectra not perceived by the human eye. The major focus of this study was to develop a workflow for analyzing RGB images collected using UAS and manned aircraft platforms, and to discern the utility of these data for the study of PRR disease progression in alfalfa.
Regardless of the platform, remote sensing images are subject to optical and perspective distortions that arise during image acquisition and from representing a three-dimensional scene in a two-dimensional format. Previous research has indicated that short-focal-length (28 mm) lenses produce more such distortions and that these distortions may need to be corrected during processing in order to measure geometric quantities correctly [30]. In the current study, the UAS image dataset was developed from the vendor-generated proprietary JPEG images without any change. For the images acquired with the manned aircraft platform, we used the Unidentified Flying Raw (UFRaw) application to convert the RAW file format to the TIFF file format, with the camera white balance option selected for color correction. As earlier research has indicated higher perspective distortion with shorter focal length (50 mm) lenses than with 85 or 105 mm lenses, the manned-aircraft image dataset in the current study was collected with an 85 mm focal length lens to minimize optical and perspective distortions [31].
Initially, we examined the possibility of developing a single spectral signature to classify all the images, which would reduce processing time and computing resources. Our data indicated that a single spectral signature is not applicable to images collected from both the UAS and manned aircraft platforms. Hence, each image was classified using three spectral signatures: (a) image-specific; (b) UAS-platform-specific; and (c) manned-aircraft-platform-specific. We found that the UAS-platform-specific spectral signature could be used to classify images acquired from the UAS platform with high accuracy (Table 3, Supplementary Tables S1 and S2). However, the same approach did not hold for the manned aircraft images: the data in Table 3 and Supplementary Tables S1 and S2 indicated higher accuracy estimates when the manned aircraft images were classified using image-specific spectral signatures.
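For illustration, the core of a maximum likelihood classification of the kind applied here can be sketched in Python. This is a minimal sketch, not the GIS implementation used in the study: each class signature is modeled as a multivariate Gaussian estimated from training pixels, and the class names and synthetic RGB training samples below are hypothetical.

```python
import numpy as np
from scipy.stats import multivariate_normal

def train_signatures(training_pixels):
    """Estimate a Gaussian spectral signature (mean, covariance) per class."""
    return {name: (px.mean(axis=0), np.cov(px, rowvar=False))
            for name, px in training_pixels.items()}

def classify_ml(pixels, signatures):
    """Assign each pixel to the class with the highest Gaussian log-likelihood."""
    names = list(signatures)
    ll = np.column_stack([
        multivariate_normal.logpdf(pixels, mean=m, cov=c)
        for m, c in (signatures[n] for n in names)])
    return np.array(names)[ll.argmax(axis=1)]

# Hypothetical RGB training samples for the three classes
rng = np.random.default_rng(0)
train = {
    "alfalfa": rng.normal([40, 120, 50], 10, size=(200, 3)),
    "soil":    rng.normal([150, 110, 90], 10, size=(200, 3)),
    "weed":    rng.normal([80, 160, 70], 10, size=(200, 3)),
}
sigs = train_signatures(train)
labels = classify_ml(np.array([[150, 112, 88], [42, 118, 52]]), sigs)
```

A platform-specific signature, in these terms, simply means that `train` pools training pixels from every image of that platform before the means and covariances are estimated.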
The acquisition of RGB images using different sensors at various flight times (Table 1) may explain why a single spectral signature could not be used for images from both platforms. The images also had different spatial resolutions; to balance this effect, all images were resampled to a coarser resolution (0.10 m), ensuring a uniform comparison. The flights for the two platforms occurred at different times of day, resulting in images with varying degrees of lighting in addition to shadowing effects. To account for differences in luminance between the images and to accurately segment pixels into the alfalfa, soil and weed classes, we investigated the utility of color space conversion for our dataset.
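Block averaging is one common way to resample a band to a coarser grid, sketched below with NumPy under the assumption that the target cell size is an integer multiple of the source; the actual GIS resampling tool used in the study may instead apply nearest-neighbour or cubic convolution methods.

```python
import numpy as np

def downsample_mean(band, factor):
    """Resample one image band to a coarser grid by averaging factor x factor blocks."""
    h, w = band.shape
    h2, w2 = h - h % factor, w - w % factor   # trim edges that do not fill a block
    blocks = band[:h2, :w2].reshape(h2 // factor, factor, w2 // factor, factor)
    return blocks.mean(axis=(1, 3))

# Toy 4 x 4 band reduced to 2 x 2 by averaging 2 x 2 blocks
band = np.arange(16, dtype=float).reshape(4, 4)
coarse = downsample_mean(band, 2)
```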
The images from our study were originally represented in the traditional RGB (red, green, blue channels) color space. Several studies have shown better separation of image features when each channel is weighted differently during transformation of the RGB color space. Such changes are expected to yield diverse color distributions in each model, as most of the transformations are non-linear [32,33]. For example, in a study comparing 11 different color spaces (RGB, normalized rgb, XYZ, L*a*b*, L*u*v*, HSV, HLS, YCrCb, YUV, I1I2I3 and TSL) for segmenting lettuce plants and soil in a set of images, the L*a*b* color space achieved the best classification, with 99.2% accuracy [34]. While there is no single optimum color space for image classification, we chose to transform our RGB dataset into the HSV color model, as this model has been shown to be robust to illumination variation and to remove shadow effects [35][36][37]. After performing the HSV transformation, we observed high variation in the pixel values of the hue channel, especially for the soil class, since the soils have a reddish hue with values just above 0 and just below 240. To minimize this variation, the hue channel pixel values were rotated by 60 units to yield the Hrot60SV transformation.
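The hue rotation can be sketched with the standard library's `colorsys` module. This is a minimal sketch assuming the 0–240 hue scale used in this study (`colorsys` itself returns hue in [0, 1), so it is rescaled); the two example pixels are hypothetical reddish soil tones whose hues sit on opposite sides of the wrap-around point.

```python
import colorsys
import numpy as np

def rgb_to_hsv_rot(rgb, rot=60.0, hue_max=240.0):
    """Convert 0-255 RGB pixels to HSV, rotating hue by `rot` units on a 0-hue_max scale.

    Reddish pixels have hues just above 0 and just below hue_max; rotating the
    channel moves them away from the wrap point so the soil class is contiguous.
    """
    out = np.empty((len(rgb), 3))
    for i, (r, g, b) in enumerate(rgb):
        h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
        out[i] = ((h * hue_max + rot) % hue_max, s, v)   # Hrot60 rotation
    return out

# Two similar reddish pixels that straddle the hue wrap-around
pixels = np.array([[200, 60, 55], [200, 55, 60]])
hsv = rgb_to_hsv_rot(pixels)
```

Before rotation the two hues are roughly 1 and 239 units apart numerically despite being visually alike; after rotation both land near 60, removing the artificial variance in the soil class.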
We expected improved classification accuracies with the HSV and Hrot60SV color spaces compared to the RGB color space (Table 3, Supplementary Tables S1 and S2), but the data did not support this hypothesis, which we therefore rejected. Although the balanced accuracy estimates for the soil class were more than 92% for the RGB color space images classified using different spectral signatures (the UAS-platform-based signature for UAS-acquired images and the image-specific spectral signature for manned-aircraft-acquired images), a major factor contributing to the differences in the overall accuracy estimates appears to be the alfalfa and weed classes. In addition to color space conversions, we investigated the effect of the spectral angle mapper (SAM) algorithm when performing supervised classification on a subset of our dataset. Unlike the maximum likelihood classification algorithm used for supervised classification in this study, SAM does not require any assumptions about the statistical distribution of the data and is not affected by solar illumination and shading effects [38]. However, SAM classification resulted in accuracies of less than 85%, lower than those of the maximum likelihood classification algorithm (data not shown). This contrasts with Yang et al. [16], who detected PRR in cotton fields using SAM with more than 95% accuracy; however, their dataset comprised multispectral images, unlike the RGB images used in the current study. While accurately segmenting crops from weeds is a recognized challenge, other studies have mitigated this hurdle using artificial neural networks and other machine-learning-based approaches; assessing their utility for our dataset is beyond the scope of this study [39,40].
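The spectral angle mapper itself is simple to express: each pixel and each reference signature are treated as vectors, and the pixel is assigned to the class whose signature subtends the smallest angle. The sketch below uses hypothetical RGB signature values. Because the angle ignores vector magnitude, a uniformly brighter or darker version of a signature maps to the same class, which is why SAM is insensitive to illumination and shading.

```python
import numpy as np

def spectral_angle(pixels, reference):
    """Spectral angle (radians) between each pixel vector and a reference signature."""
    pixels = np.asarray(pixels, dtype=float)
    ref = np.asarray(reference, dtype=float)
    cos = pixels @ ref / (np.linalg.norm(pixels, axis=1) * np.linalg.norm(ref))
    return np.arccos(np.clip(cos, -1.0, 1.0))

def classify_sam(pixels, signatures):
    """Assign each pixel to the class whose signature makes the smallest angle."""
    names = list(signatures)
    angles = np.column_stack([spectral_angle(pixels, signatures[n]) for n in names])
    return np.array(names)[angles.argmin(axis=1)]

# Hypothetical mean RGB signatures; the first test pixel is a brightened
# (scaled) alfalfa signature, which SAM still maps to alfalfa
sigs = {"alfalfa": [40, 120, 50], "soil": [150, 110, 90]}
labels = classify_sam([[80, 240, 100], [150, 110, 90]], sigs)
```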
Post-classification processing steps were performed to minimize the speckle effect in the image, i.e., to remove misclassified isolated pixels and small regions of less than 100 pixels. This process either maintained or improved the overall accuracy for five of the six images analyzed (Table 6). For June 2014, the overall mean accuracy decreased from 0.901 to 0.898. This is also the image for which we determined that the UAS-platform-specific spectral signature performed better than the image-specific spectral signature: we estimated the congruency between the classified images generated by both spectral signatures and the manually classified RGB pixels and found the UAS-platform-specific signature to be more accurate. We also observed lower balanced accuracy values for the weed class for this image. This is the result of significantly fewer weeds, leading to smaller training and validation datasets (Table 2). We hypothesize that the smaller number of training pixels contributed to the misclassification of some pixels as weed instead of alfalfa, or vice versa.
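A sieve of this kind can be sketched with `scipy.ndimage`: connected regions smaller than the threshold are relabeled from their border neighbours. This is a simple majority-rule sketch, not necessarily the rule applied by the GIS software used in the study; the speckled test map is synthetic.

```python
import numpy as np
from scipy import ndimage

def sieve(class_map, min_size=100):
    """Remove connected regions smaller than min_size pixels from a class map.

    Each small region is replaced by the most common label among the pixels
    bordering it (a simple majority rule).
    """
    out = class_map.copy()
    for cls in np.unique(class_map):
        mask_cls = class_map == cls
        labeled, n = ndimage.label(mask_cls)
        sizes = ndimage.sum(mask_cls, labeled, range(1, n + 1))
        for region in np.nonzero(sizes < min_size)[0] + 1:
            region_mask = labeled == region
            border = ndimage.binary_dilation(region_mask) & ~region_mask
            neighbours = out[border]
            if neighbours.size:
                out[region_mask] = np.bincount(neighbours).argmax()
    return out

speckled = np.zeros((20, 20), dtype=int)
speckled[5:7, 5:7] = 1            # a 4-pixel misclassified speck
cleaned = sieve(speckled, min_size=100)
```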
The classified RGB aerial images, used to estimate the extent of alfalfa stand loss due to PRR, showed wide fluctuations in the areas of the three classes, especially the soil class. Weeds occupied the newly bare soil areas created by the PRR disease, reducing the area classified as soil. Nevertheless, by the end of the study period (October 2015), about 10 ha were recorded for the soil class, compared to 5.8 ha at the start of the study period (June 2014). This can be attributed to the formation of new diseased areas as well as the enlargement of existing ones. Similar effects of PRR have been observed within a single cotton growing season using multispectral images, where the percentage of root-rot-infected area increased from 5.4% to 13.2% and from 21.6% to 26.8% in two fields in Edroy, TX, and from 27.0% to 37.8% and from 21.4% to 50.6% in two fields in San Angelo, TX [29].
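Area estimates of this kind follow directly from per-class pixel counts and the pixel footprint: at the 0.10 m resampled resolution, each pixel covers 0.01 m². The class map below is a synthetic stand-in for a classified image.

```python
import numpy as np

def class_areas_ha(class_map, pixel_size_m=0.10):
    """Convert per-class pixel counts of a classified image into hectares."""
    px_area_m2 = pixel_size_m ** 2
    classes, counts = np.unique(class_map, return_counts=True)
    return {int(c): n * px_area_m2 / 10_000.0 for c, n in zip(classes, counts)}

# A hypothetical 1000 x 1000 pixel map, i.e., 1 ha at 0.10 m resolution
demo = np.zeros((1000, 1000), dtype=int)
demo[:, :300] = 1                  # 30% of the map classified as soil (label 1)
areas = class_areas_ha(demo)
```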

Conclusions
To summarize, Phymatotrichopsis root rot disease severely limits alfalfa production in southern Oklahoma. Infestation of alfalfa fields with P. omnivora often reduces stand life and productivity. The extent of disease spread occurring in a growing season greatly affects alfalfa stand longevity, but little is understood about this phenomenon. Hence, through this study we provide a framework for obtaining high-resolution RGB aerial images from either UAS or manned aircraft platforms and a subsequent workflow to deduce the extent of PRR disease spread. Understanding the loss of alfalfa stand area reported from aerial images could help a producer make informed management choices, such as replanting or site-specific fungicide application, to slow the spread of the disease.
Supplementary Materials: The following are available online at http://www.mdpi.com/2072-4292/10/6/917/s1, Figure S1: Aerial imaging platforms used for data collection from a Phymatotrichopsis root-rot-infested alfalfa hay production field during 2014 and 2015. (A) Vireo unmanned aerial vehicle, (B) Dragonfly sport utility piloted aircraft; Table S1: Accuracy assessment values of HSV images using different spectral signatures; Table S2: Accuracy assessment values of Hrot60SV images using different spectral signatures.