Title Accuracy analysis of direct georeferenced UAV images utilising low-cost navigation sensors
Author(s) Christian Briese, Martin Wieser, Geert Verhoeven, Philipp Glira, Michael Doneus, Norbert Pfeifer
Conference EGU General Assembly 2014
Media type Article
Language English
Digital document PDF
Published In: GRA - Volume 16 (2014)
Record number 250089411
Publication (no.) Full-text document available: EGU/EGU2014-3611.pdf
 
Abstract
Unmanned aerial vehicles (UAVs), also known as unmanned airborne systems (UAS) or remotely piloted airborne systems (RPAS), are an established platform for close-range airborne photogrammetry. Compared to manned platforms, the acquisition of local remote sensing data by UAVs is a convenient and very flexible option. For photogrammetric applications, UAVs are typically equipped with an autopilot and a lightweight digital camera. The autopilot includes several navigation sensors, which may allow automated waypoint flights and thereby a systematic data acquisition of the object or scene of interest. Assuming sufficient overlap between the captured images, the position (three coordinates: x, y, z) and the orientation (three angles: roll, pitch, yaw) of the images can be estimated within a bundle block adjustment. Subsequently, the coordinates of points that appear in at least two images can be determined by measuring their image coordinates, or a dense surface model can be generated from all acquired images by automated image matching. The bundle block adjustment requires approximate values for the position and orientation of the images. Several methods exist to obtain this information; in this contribution we introduce one of them: the direct georeferencing of images using the navigation sensors (mainly GNSS and INS) of a low-cost on-board autopilot.

Besides enabling automated flights, the autopilot can record the position and the orientation of the platform during the flight. These values do not correspond directly to the position and orientation of the images. To compute the latter, two requirements must be fulfilled: first, the misalignment angles and the positional offset between the camera and the autopilot must be determined (mounting calibration); second, the synchronization between the camera and the autopilot has to be established. Owing to the limited accuracy of the navigation sensors, a small number of ground control points should be used to improve the estimated values and, in particular, to reduce systematic errors. In addition, the camera calibration and its temporal stability must be determined for the bundle block adjustment.

Alongside the theory, this contribution presents a practical study on the accuracy of directly georeferenced UAV imagery acquired with low-cost navigation sensors. The analysis was carried out within the research project ARAP (automated (ortho)rectification of archaeological aerial photographs). The utilised UAS consists of the airplane “MAJA”, manufactured by “Bormatec” (length: 1.2 m, wingspan: 2.2 m), equipped with the autopilot “ArduPilot Mega 2.5”. For image acquisition the camera “Ricoh GR Digital IV” is used. The autopilot includes a GNSS receiver capable of DGPS (EGNOS), an inertial navigation system (INS), a barometer, and a magnetometer. The study presents the achieved accuracies for the estimated position and orientation of the images. The paper concludes with a summary of the remaining error sources and their possible correction through further improvements to the utilised equipment and the direct georeferencing process.
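The mounting-calibration step described in the abstract can be illustrated with a minimal sketch: given a logged platform position and attitude, the camera projection centre and rotation are obtained by applying a lever-arm offset and a boresight rotation. The function and parameter names (camera_pose, boresight_rpy, lever_arm) and the z-y-x rotation convention are illustrative assumptions, not the authors' implementation or the ArduPilot data format.

```python
import numpy as np


def rotation_matrix(roll, pitch, yaw):
    """Rotation from the body frame to the local-level (navigation) frame,
    built from roll, pitch, yaw in radians (z-y-x convention assumed)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx


def camera_pose(platform_pos, platform_rpy, boresight_rpy, lever_arm):
    """Derive camera position and rotation from one autopilot record.

    platform_pos : (3,) platform position in a local metric frame [m]
    platform_rpy : (3,) platform roll, pitch, yaw [rad]
    boresight_rpy: (3,) misalignment angles camera vs. body frame [rad]
    lever_arm    : (3,) offset autopilot/IMU -> camera in the body frame [m]
    """
    R_body = rotation_matrix(*platform_rpy)   # body -> navigation frame
    R_bore = rotation_matrix(*boresight_rpy)  # camera -> body frame
    cam_position = platform_pos + R_body @ lever_arm
    cam_rotation = R_body @ R_bore            # camera -> navigation frame
    return cam_position, cam_rotation


if __name__ == "__main__":
    # Illustrative values only: small boresight angles, decimetre lever arm.
    pos, R = camera_pose(
        platform_pos=np.array([1000.0, 2000.0, 150.0]),
        platform_rpy=np.radians([1.0, -2.0, 45.0]),
        boresight_rpy=np.radians([0.5, -0.3, 90.0]),
        lever_arm=np.array([0.05, 0.0, -0.10]),
    )
    print(pos)
    print(R)
```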
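The second requirement, synchronization between camera and autopilot, amounts to interpolating the logged trajectory to the image exposure epochs after accounting for the clock offset between the two devices. The sketch below assumes linear interpolation and a constant offset camera_offset_s; both the interpolation scheme and the parameter names are assumptions for illustration, and yaw would need to be unwrapped beforehand to avoid interpolating across the 0/360 degree boundary.

```python
import numpy as np


def interpolate_pose(log_times, log_positions, log_rpy, exposure_time,
                     camera_offset_s=0.0):
    """Linearly interpolate platform position and attitude to one exposure epoch.

    log_times       : (N,) autopilot time stamps [s], strictly increasing
    log_positions   : (N, 3) logged platform positions [m]
    log_rpy         : (N, 3) logged roll, pitch, yaw [rad], yaw unwrapped
    exposure_time   : camera time stamp of the image [s]
    camera_offset_s : constant clock offset camera -> autopilot [s]
    """
    t = exposure_time + camera_offset_s
    pos = np.array([np.interp(t, log_times, log_positions[:, k]) for k in range(3)])
    rpy = np.array([np.interp(t, log_times, log_rpy[:, k]) for k in range(3)])
    return pos, rpy
```

The interpolated pose for each image would then be passed to a mounting-calibration routine such as the camera_pose sketch above to obtain approximate exterior orientations for the bundle block adjustment.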