Incorporating HyTools into the current image-processing pipeline to produce better vegetation maps that account for radiometric signals and parallelize the workflow
Project Information
Tags: big-data, gis, hpc-operations, image-processing, python, r
Project Status: Complete
Project Region: Northeast
Submitted By: Larry Whitsel
Project Email: peter.nelson@maine.edu
Project Institution: University of Maine at Fort Kent
Anchor Institution: NE-University of Maine
Project Address: Cyr Hall, University of Maine at Fort Kent
23 University Drive
Fort Kent, Maine 04743
Mentors: Larry Whitsel
Students: Tolu Oyeniyi
Project Description
The ability to make use of remote sensing data is of particular interest to the State of Maine, given its large and remote forestry and agricultural resources. Such data occurs at several vastly different scales, from satellite imagery all the way down to manual inspections of vegetation. A multi-institution research team led by faculty at the University of Maine at Fort Kent uses aerial drone imagery and technologies referred to as “hyperspectral” cameras or scanners to identify the species and condition of ground cover across a sizable area of interest. Underlying these technologies is the assumption that each material or target has a unique spectral profile that allows it to be distinguished from similar co-occurring targets. The sensors detect dozens or hundreds of spectral bands in the visible and near-infrared range (compared to three RGB bands in a normal camera), which allows for better detection of different targets, including plants, plant stress, chemical signatures of rocks, and many other attributes.

This project enlists a student worker to begin the process of analyzing and incorporating the hyperspectral image processing pipeline, HyTools, on our cyberinfrastructure so that it works with our current data. The data to be analyzed includes over 100 TB of hyperspectral images collected by Unoccupied Aerial Vehicles (UAVs). HyTools will be configured on the Advanced Computing Group (ACG) server cluster. The result will be more useful maps that account for radiometric signals found in the data.
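The idea that each target has a unique spectral profile can be illustrated with a small sketch. This is not HyTools code; it is a minimal, hypothetical example of one common matching technique (the spectral angle mapper), using made-up four-band reflectance values, to show how a pixel spectrum can be assigned to the closest profile in a reference library:

```python
import numpy as np

def spectral_angle(pixel, reference):
    """Spectral angle (radians) between a pixel spectrum and a reference
    spectrum; smaller angles mean more similar spectral profiles."""
    cos = np.dot(pixel, reference) / (np.linalg.norm(pixel) * np.linalg.norm(reference))
    return np.arccos(np.clip(cos, -1.0, 1.0))

def classify_pixel(pixel, library):
    """Assign the pixel to whichever library spectrum it is closest to."""
    return min(library, key=lambda name: spectral_angle(pixel, library[name]))

# Toy 4-band reflectance spectra (a real hyperspectral sensor records
# dozens to hundreds of bands across the visible and near-infrared)
library = {
    "healthy_vegetation": np.array([0.05, 0.08, 0.45, 0.50]),
    "stressed_vegetation": np.array([0.08, 0.12, 0.30, 0.35]),
    "bare_rock": np.array([0.20, 0.22, 0.25, 0.27]),
}
pixel = np.array([0.06, 0.09, 0.43, 0.48])
print(classify_pixel(pixel, library))  # closest match: healthy_vegetation
```

A full pipeline would apply this kind of comparison per pixel across an entire flight line, which is exactly the sort of embarrassingly parallel work that maps well onto a server cluster.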