Project defended
Title: Reconocimiento de objetos para el control de drones en el Drone Engineering Ecosystem (Object recognition for drone control in the Drone Engineering Ecosystem)
Students who have defended this project:
- MARCOS I PAYA, POL (defence date: 12-09-2024)
Supervisor: SALAMÍ SAN JUAN, ESTHER
Department: DAC
Offer start date: 29-01-2024
Offer end date: 29-09-2024
Degree programmes the project is assigned to:
- GR ENG SIS TELECOMUN
- GR ENG TELEMÀTICA
Type: Individual
Location: EETAC
Institution/Company: Technical University of Catalonia (UPC)
Second supervisor (UPC): VALERO GARCÍA, MIGUEL
Keywords:
Object recognition, drone, video streaming
Description of content and activity plan:
A platform called the Drone Engineering Ecosystem (DEE) is currently available; it supports the management of drone operations (mission planning, management of the information captured by the drone, etc.). The platform has been built up from the contributions of EETAC students through their Master's and Bachelor's theses. Its development relies on advanced technologies for web, mobile and desktop applications (Python, Vue, Ionic, Capacitor, Git), as well as the technologies most widely used in the world of open-source drone systems (ArduPilot, DroneKit, Mission Planner, etc.). More information about the DEE can be found at https://github.com/dronsEETAC/DroneEngineeringEcosystemDEE
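As a quick illustration of how a ground application typically talks to an ArduPilot-based vehicle with DroneKit (one of the technologies listed above), a minimal sketch follows; the connection string assumes a local SITL simulator and is not a detail taken from the DEE itself.

```python
# Minimal DroneKit sketch: connect to an ArduPilot vehicle and read telemetry.
# The connection string below assumes a local SITL simulator; a real drone
# would typically be reached over a serial port or a telemetry radio.
from dronekit import connect

vehicle = connect("tcp:127.0.0.1:5763", wait_ready=True)  # assumed SITL endpoint

print("Mode:    ", vehicle.mode.name)
print("Armed:   ", vehicle.armed)
print("Battery: ", vehicle.battery)
print("Position:", vehicle.location.global_relative_frame)

vehicle.close()
```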
The initial objective of this project is to add an object recognition module whose detections will be used to guide the drone's trajectory. In addition, different techniques for transmitting video to the ground devices will be investigated and evaluated. The functionality developed will be tested in the dronLab located on the Campus.
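One simple baseline among the video-transmission techniques to be evaluated is to send JPEG-compressed camera frames over a TCP socket to a ground station. The sketch below illustrates only that generic baseline; the host, port and camera index are hypothetical placeholders, not details of the DEE.

```python
# Illustrative baseline for drone-to-ground video transmission:
# JPEG-compress each camera frame and send it over a TCP socket.
# Host, port and camera index are hypothetical placeholders.
import socket
import struct

import cv2

GROUND_HOST = "192.168.1.10"   # hypothetical ground-station address
GROUND_PORT = 5000             # hypothetical port

cap = cv2.VideoCapture(0)                      # on-board camera
sock = socket.create_connection((GROUND_HOST, GROUND_PORT))

try:
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # JPEG compression keeps the per-frame payload small.
        ok, jpeg = cv2.imencode(".jpg", frame, [cv2.IMWRITE_JPEG_QUALITY, 80])
        if not ok:
            continue
        data = jpeg.tobytes()
        # Length-prefixed framing so the receiver knows where each frame ends.
        sock.sendall(struct.pack(">I", len(data)) + data)
finally:
    cap.release()
    sock.close()
```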
Overview (abstract in English):
The main objective of this final degree project is to develop software that allows a drone to identify and classify objects in real time using a camera. To achieve this, advanced computer vision techniques and artificial intelligence algorithms have been used, ensuring correct integration between the developed software and the drone control system. A specific use case is to guide the movement of the drone along a route marked by objects strategically placed on the ground.
The methodology used to meet the objectives set out focused on the implementation of an object detection module based on the YOLO (You Only Look Once) algorithm, a convolutional neural network optimised for real-time object detection. The module was developed in Python, and its integration into the Drone Engineering Ecosystem (DEE), a drone control platform, enabled the identification of objects and subsequent decision-making by the drone. During the development process, different YOLOv8 models (v8n, v8s, v8m, v8l, v8x) were selected and evaluated, and then retrained using a proprietary dataset that included classes such as banana, ball, box and backpack.

Several tests were performed, both in simulated environments and in a laboratory with a real drone, to measure the accuracy and efficiency of the system. The results were satisfactory, achieving an improvement in object detection compared to pre-trained models, with accuracy increases of up to 53% in some cases.

Despite the achievements, the project had limitations, such as the impossibility of implementing object detection on the drone's Raspberry Pi due to technical problems with the library used, which restricted image processing to the ground equipment. In addition, the resolution of the drone's camera was not optimal for detecting small objects, and some false positives were observed that at times diverted the drone from its route. In conclusion, the project demonstrated the effectiveness of integrating an advanced object detection system into the DEE, opening the door to future improvements in model accuracy and drone functionality. Future lines of development are suggested to optimise the system, such as the reduction of false positives and the integration of processing on the Raspberry Pi.
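As a rough sketch of the workflow described above (fine-tuning a YOLOv8 model on a custom dataset and then running it on incoming frames), the following uses the Ultralytics API; the dataset file, number of epochs and confidence threshold are illustrative assumptions rather than the values used in the project.

```python
# Sketch of the retrain-and-detect workflow described in the abstract,
# using the Ultralytics YOLOv8 API. Dataset path, epochs and the
# confidence threshold are illustrative assumptions.
import cv2
from ultralytics import YOLO

# 1) Fine-tune a pre-trained model on a custom dataset
#    (e.g. classes such as banana, ball, box, backpack defined in the YAML).
model = YOLO("yolov8n.pt")                      # smallest variant, as an example
model.train(data="custom_dataset.yaml",         # hypothetical dataset file
            epochs=50, imgsz=640)

# 2) Run the retrained model on a single frame (e.g. one taken from the video stream).
frame = cv2.imread("frame.jpg")                 # placeholder for a streamed frame
results = model(frame, conf=0.5)                # confidence threshold is an assumption

for box in results[0].boxes:
    cls_name = model.names[int(box.cls)]
    confidence = float(box.conf)
    x1, y1, x2, y2 = box.xyxy[0].tolist()
    print(f"{cls_name}: {confidence:.2f} at ({x1:.0f}, {y1:.0f}, {x2:.0f}, {y2:.0f})")
```

In the project itself the frames would come from the drone's video stream rather than a file on disk, and the model variant (v8n through v8x) would be chosen against the accuracy and latency trade-off reported above.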