On 30, Sep 2016 | In UAV/drone | By Jeff Smyczek
As part of the Summer Undergraduate Research Fellowship (SURF), I am required to present my research at multiple poster sessions throughout the year, including the Fall and Spring Undergraduate Research Days. I had never presented at a poster session before and was experiencing the uncertainty that comes with doing something for the first time. How do I fit a whole summer’s worth of information onto a poster? How are these posters typically formatted? To whom will I present my research? With help from the UW-Whitewater Undergraduate Research Program (URP), answers to these questions started to take shape. With guidance from Dr. Compas, I put together a poster that reflects the progress I have made on my research project, and my uncertainties were resolved.
On 27, Jun 2016 | In UAV/drone | By Jeff Smyczek
After conducting our initial mission, we discovered a problem with the images taken by the Tetracam. We were missing images from one of the lenses, which prevented us from calculating the normalized difference vegetation index (NDVI). After trying to resolve the issue with Tetracam support, we had to send the camera back to Tetracam headquarters for repairs. Sending the camera halfway across the country cost us valuable research time, as we intended to monitor vegetation health throughout the growing season.
Once the issues were resolved, I performed pre-flight inspections and took off!
The mission went smoothly, and I was able to capture our first useful data from the Tetracam. I also collected biomass samples so that we could correlate the aerial data with live biomass weight once the samples were dried. I then uploaded the images and began processing them in Pixel Wrench, Pix4D, and ArcMap to calculate NDVI from the resulting orthomosaic.
Once the images are assigned GPS positions, Pix4D is able to locate them and create a ray cloud, shown here:
From the ray cloud, an orthomosaic is created that we can upload to ArcMap to calculate NDVI. NDVI measures the amount of live green vegetation in a target, allowing us to analyze vegetation health remotely. We are still working on the image calibration procedure, but our preliminary NDVI calculation from the first flight is shown below.
While these accomplishments were significant, we were still seeing artifacts (straight lines and vignetting) in our imagery and resulting orthomosaic that will require further work.
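For reference, NDVI is computed per pixel from the near-infrared and red bands as (NIR − Red) / (NIR + Red). A minimal sketch of that calculation in Python with NumPy, using illustrative reflectance values rather than data from our flights:

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Per-pixel normalized difference vegetation index."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    # eps avoids division by zero where NIR + Red == 0 (e.g. masked pixels)
    return (nir - red) / (nir + red + eps)

# Illustrative values only: healthy vegetation reflects strongly in NIR and
# absorbs red, so its NDVI approaches +1; bare soil sits near zero.
nir_band = np.array([[0.50, 0.45],
                     [0.10, 0.30]])
red_band = np.array([[0.08, 0.10],
                     [0.09, 0.25]])
print(np.round(ndvi(nir_band, red_band), 2))
```

In practice this same arithmetic is applied to the calibrated narrow-band rasters of the orthomosaic, which is what the ArcMap step performs.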
On 12, Jun 2016 | In UAV/drone | By Jeff Smyczek
Thanks to the University of Wisconsin-Whitewater Undergraduate Research Program and the Wisconsin Space Grant Consortium, I was able to conduct my first research project as part of the Summer Undergraduate Research Fellowship (SURF) program. I began a project mapping vegetation health from a small unmanned aerial system (sUAS), or “drone,” over a restored natural prairie called Fair Meadows. The purpose of this project is to answer the question: can multispectral drone imagery be used to accurately monitor vegetation health in a variable natural setting? To answer it, Dr. Eric Compas and I embarked on a journey we knew would have its share of problems. Some research has already been conducted using these new developments in technology, and we hoped to add to the knowledge and use of sUAS.
The first step of the project was to learn and prepare the new Tetracam RGB+3 for flight. The Tetracam RGB+3 is a multispectral camera with four lenses: broad-band RGB plus narrow-band red, red edge, and near-infrared (NIR). The narrow bands allow us to generate several vegetation indices, such as the normalized difference vegetation index (NDVI). After scouring the manual, we learned best practices for the Tetracam RGB+3 and configured settings we thought appropriate for our study. With our newfound knowledge, we started on-the-ground testing, taking pictures of reflectance standards for proper image calibration. With calibration still in the works, I began fabricating a custom mounting system to attach the camera to the drone. We had some significant constraints: we needed to stay within the 3DR Solo’s payload limit of about 420 grams, and the RGB+3 alone weighs 400 grams. We also had to make sure not to shift the drone’s center of gravity with this significant weight.
As with any engineering project, prototypes were necessary, and this mounting system was no exception. My first mount held the camera extremely well, so we took it out for a test flight. The mount held up, and we conducted our first mission. Upon inspecting the images in the lab, we noticed a slight blurriness and went back to the drawing board. We decided to fabricate a new mount that would isolate the camera from the drone’s vibrations. After tossing around ideas, we settled on a bungee-cord suspension system (with help from Tetracam). Success! Images from the next flight were sharp, and the mount held the camera securely as it traveled 50 meters above the prairie.