Publications


Images of a maize field in early growth stage

Digital.CSIC. Repositorio Institucional del CSIC
  • Herrera-Diaz, Jesus
  • Emmi, Luis Alfredo
  • González-de-Santos, Pablo
The dataset is composed of several images named following the pattern: type of crop_date_number.png.
[Methodological information] The data were acquired using the TRI016S-CC RGB camera from Lucid Vision Labs equipped with the SV-0614V lens (resolution: 1.6 MP; FoV: 54.6° × 42.3°).
[Environmental/experimental conditions] The data were acquired by manually operating a mobile platform during different time periods and weather conditions in the same season.
[People involved with sample collection, processing, analysis and/or submission] Jesus Herrera-Diaz (Methodology), Luis Emmi (Investigation), Pablo Gonzalez-de-Santos (Supervision).
This dataset is part of the WeLASER project, which has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 101000256.
Peer reviewed
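As a small reuse aside, the minimal Python sketch below groups the images by crop type given the documented naming pattern. The date field's exact format is not stated in the record, so it is captured as an opaque string, and index_images is a hypothetical helper, not part of the dataset.

```python
import re
from pathlib import Path

# Documented pattern: <type of crop>_<date>_<number>.png. The date format is
# not specified in the record, so the middle field is kept as an opaque string.
FILENAME_RE = re.compile(r"^(?P<crop>[^_]+)_(?P<date>.+)_(?P<number>\d+)\.png$")

def index_images(root):
    """Group image paths by crop type using the naming convention (hypothetical helper)."""
    index = {}
    for path in Path(root).rglob("*.png"):
        match = FILENAME_RE.match(path.name)
        if match:
            index.setdefault(match.group("crop"), []).append(path)
    return index
```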




Images of a wheat field in early growth stage

Digital.CSIC. Repositorio Institucional del CSIC
  • Herrera-Diaz, Jesus
  • Emmi, Luis Alfredo
  • González-de-Santos, Pablo
The dataset is composed of several images named following the pattern: type of crop_date_number.png.
[Methodological information] The data were acquired using the TRI016S-CC RGB camera from Lucid Vision Labs equipped with the SV-0614V lens (resolution: 1.6 MP; FoV: 54.6° × 42.3°).
[Environmental/experimental conditions] The data were acquired by manually operating a mobile platform during different time periods and weather conditions in the same season.
[People involved with sample collection, processing, analysis and/or submission] Jesus Herrera-Diaz (Methodology), Luis Emmi (Investigation), Pablo Gonzalez-de-Santos (Supervision).
This dataset is part of the WeLASER project, which has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 101000256.
Peer reviewed




Toward autonomous mobile robot navigation in early-stage crop growth

Digital.CSIC. Repositorio Institucional del CSIC
  • Emmi, Luis Alfredo
  • Herrera-Diaz, Jesus
  • González-de-Santos, Pablo
19th International Conference on Informatics in Control, Automation and Robotics (ICINCO 2022), 14-16 July 2022, Lisbon, Portugal.
This paper presents a general procedure for enabling autonomous row following in crops during early-stage growth, without relying on absolute localization systems. A model based on deep learning techniques (object detection for wide-row crops and segmentation for narrow-row crops) was applied to accurately detect both types of crops. Tests were performed using a manually operated mobile platform equipped with an RGB camera and a time-of-flight (ToF) camera. Data were acquired during different time periods and weather conditions, in maize and wheat fields. The results demonstrated successful crop detection and enable the future development of a fully autonomous navigation system in cultivated fields during early stages of crop growth.
This article is part of the WeLASER project, which has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 101000256.
Peer reviewed
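The paper's trained models are not distributed with this record, so the following is only a generic stand-in: a minimal sketch of the object-detection phase using a pretrained torchvision detector, which would need fine-tuning on labeled crop images before producing meaningful crop detections.

```python
import torch
from torchvision.models import detection

# Generic pretrained detector as a stand-in for the paper's wide-row crop
# model; COCO weights would need fine-tuning on crop images to be useful.
model = detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

@torch.no_grad()
def detect_crops(image_tensor, score_threshold=0.5):
    """image_tensor: float tensor of shape (3, H, W), values in [0, 1]."""
    prediction = model([image_tensor])[0]
    keep = prediction["scores"] >= score_threshold
    return prediction["boxes"][keep], prediction["scores"][keep]
```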




Images of a maize field with 2-3 leaf collars

Digital.CSIC. Repositorio Institucional del CSIC
  • Emmi, Luis Alfredo
  • González-de-Santos, Pablo
[Description of methods used for collection/generation of data] The data were acquired using the TRI016S-CC RGB camera from Lucid Vision Labs equipped with the SV-0614V lens (resolution: 3.2 MP; FoV: 54.6° × 42.3°).
This dataset consists of 322 jpg images obtained in an experimental maize field at the Centre for Automation and Robotics (UPM-CSIC), Carr. Campo Real, km 0, 200, 28500 Arganda del Rey, Madrid, Spain, when the crop was in an early growth stage (approximately 2-3 leaf collars). The images were taken using a TRI016S-CC RGB camera from Lucid Vision Labs equipped with the SV-0614V lens, mounted on an autonomous tractor developed within the WeLASER project. The images were taken at a height of approximately 0.5 meters above the ground and at an angle of 22 degrees, on the same day, under changing lighting conditions.
The dataset is composed of several images, arranged in 4 folders. Within each folder the images are named as follows: m13-m14-<folder_last_number>-<image_number>.jpg.
This dataset is part of the WeLASER project, which has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 101000256.
Peer reviewed




Images of a sugar beet field with 2-3 leaves

Digital.CSIC. Repositorio Institucional del CSIC
  • Emmi, Luis Alfredo
  • González-de-Santos, Pablo
[Description of methods used for collection/generation of data] The data were acquired using the TRI016S-CC RGB camera from Lucid Vision Labs equipped with the SV-0614V lens (resolution: 3.2 MP; FoV: 54.6° × 42.3°).
This dataset consists of 225 jpg images obtained in an experimental sugar beet field at the Centre for Automation and Robotics (UPM-CSIC), Carr. Campo Real, km 0, 200, 28500 Arganda del Rey, Madrid, Spain, when the crop was in an early growth stage (approximately 2-3 leaves). The images were taken using a TRI016S-CC RGB camera from Lucid Vision Labs equipped with the SV-0614V lens, mounted on an autonomous tractor developed within the WeLASER project. The images were taken at a height of approximately 0.5 meters above the ground and at an angle of 22 degrees, on the same day, under changing lighting conditions.
The dataset is composed of several images, arranged in 4 folders. Within each folder the images are named as follows: m05-sbi-<folder_last_number><image_number>.jpg.
This dataset is part of the WeLASER project, which has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 101000256.
Peer reviewed




Digital representation of smart agricultural environments for robot navigation

Digital.CSIC. Repositorio Institucional del CSIC
  • Emmi, Luis Alfredo
  • Parra, Rebeca
  • González-de-Santos, Pablo
Proceedings of the 10th International Conference on Information and Communication Technologies in Agriculture, Food and Environment (HAICTA 2022), 22-25 September 2022, Athens, Greece.
In recent years, digitization has had a significant impact on food production systems, allowing various technologies and advanced data-processing strategies to be implemented. Alongside the introduction of tools for the digitalization of the field, the automation of tasks through the use of mobile robots has also been growing in recent decades. These systems are nourished by the acquisition of field data to carry out autonomous operations including weed management, the application of fertilizers, and harvesting, among others. One of the current challenges of integrating robotic solutions in the digital age of agriculture is the development of scalable and interoperable systems able to manage data obtained from third parties. This work presents an overall methodology for map creation using open-source tools, which allows data from the daily activities of an agricultural farm to be managed, and the autonomous tasks of robotic systems to be planned and supervised.
This article is part of the WeLASER project funded by the European Union’s Horizon 2020 research and innovation programme under grant agreement No 101000256.
Peer reviewed
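For readers unfamiliar with the map format used in this line of work, the sketch below writes a minimal GeoJSON representation of one farm element using only the Python standard library; the coordinates and property names are illustrative assumptions, not taken from the project.

```python
import json

# Hypothetical farm map element as GeoJSON; coordinates and properties are
# illustrative only, not from the WeLASER project files.
field = {
    "type": "FeatureCollection",
    "features": [
        {
            "type": "Feature",
            "properties": {"name": "Field 3", "use": "maize"},
            "geometry": {
                "type": "Polygon",
                "coordinates": [[
                    [-3.3391, 40.3130],
                    [-3.3380, 40.3130],
                    [-3.3380, 40.3124],
                    [-3.3391, 40.3124],
                    [-3.3391, 40.3130],  # closed ring: first point repeated
                ]],
            },
        }
    ],
}

with open("map.geojson", "w") as handle:
    json.dump(field, handle, indent=2)
```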




Exploiting the Internet Resources for Autonomous Robots in Agriculture

Digital.CSIC. Repositorio Institucional del CSIC
  • Emmi, Luis Alfredo
  • Fernández, Roemi
  • González-de-Santos, Pablo
  • Francia, Matteo
  • Golfarelli, Matteo
  • Vitali, Giuliano
  • Sandmann, Hendrik
  • Hustedt, Michael
  • Wollweber, Merve
Special Issue "Robots and Autonomous Machines for Agriculture Production".
Autonomous robots in the agri-food sector are increasing yearly, promoting the application of precision agriculture techniques. The same applies to online services and techniques implemented over the Internet, such as the Internet of Things (IoT) and cloud computing, which make big data, edge computing, and digital twin technologies possible. Developers of autonomous vehicles understand that autonomous robots for agriculture must take advantage of these techniques on the Internet to strengthen their usability. This integration can be achieved using different strategies, but existing tools can facilitate integration by providing benefits for developers and users. This study presents an architecture to integrate the different components of an autonomous robot that provides access to the cloud, taking advantage of the services provided regarding data storage, scalability, accessibility, data sharing, and data analytics. In addition, the study reveals the advantages of integrating new technologies into autonomous robots that can bring significant benefits to farmers. The architecture is based on the Robot Operating System (ROS), a collection of software applications for communication among subsystems, and FIWARE (Future Internet WARE), a framework of open-source components that accelerates the development of intelligent solutions. To validate and assess the proposed architecture, this study focuses on a specific example of an innovative weeding application with laser technology in agriculture. The robot controller is distributed between the robot hardware, which provides real-time functions, and the cloud, which provides access to online resources. Analyzing the resulting characteristics, such as transfer speed, latency, response and processing time, and response status based on requests, enabled a positive assessment of the use of ROS and FIWARE for integrating autonomous robots and the Internet.
This article is part of a project that has received funding from the European Union’s Horizon 2020 research and innovation program under grant agreement No 101000256.
Peer reviewed
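As a hedged illustration of the FIWARE side of such an architecture, the sketch below registers a robot as an NGSIv2 entity in an Orion Context Broker assumed to be running locally on its default port; the entity id, type, and attributes are hypothetical, not the paper's data model.

```python
import requests

# Assumes a FIWARE Orion Context Broker on the default local port; the
# entity shown is an illustrative stand-in, not the WeLASER data model.
ORION_URL = "http://localhost:1026/v2/entities"

entity = {
    "id": "urn:ngsi-ld:Robot:welaser-001",
    "type": "Robot",
    "speed": {"type": "Number", "value": 0.8},
    "location": {
        "type": "geo:json",
        "value": {"type": "Point", "coordinates": [-3.339, 40.313]},
    },
}

response = requests.post(ORION_URL, json=entity)
response.raise_for_status()  # expects 201 Created on success
```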




Images and annotations of maize crop in early growth stages (1 - 8 leaves)

Digital.CSIC. Repositorio Institucional del CSIC
  • Emmi, Luis Alfredo
  • González-de-Santos, Pablo
  • Cortinas, Eloisa
[Description of methods used for collection/generation of data] The data were acquired using the TRI032S-CC RGB camera from Lucid Vision Labs equipped with the SV-04514V lens (resolution: 5 MP; FoV: 59.4° × 79°).
This dataset consists of a set of images of maize fields representing real-world conditions, acquired from a medium-high angle perspective and covering five distinct growth stages based on the BBCH notation system.
The dataset is composed of 5 folders, each containing a specific growth stage based on the BBCH notation system. Within each folder are images in .jpg format and matching .txt files containing the bounding boxes that identify the plants within each image.
This dataset is part of the WeLASER project, which has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 101000256.
Peer reviewed
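The record does not specify the layout of the .txt annotation files, so the reader sketch below assumes the common YOLO convention (class id followed by normalized center coordinates and box size); adjust if the actual files differ.

```python
from pathlib import Path

# Assumption: each line is "class_id x_center y_center width height", with
# coordinates normalized to [0, 1]. The record does not confirm this format.
def read_annotations(txt_path):
    boxes = []
    for line in Path(txt_path).read_text().splitlines():
        fields = line.split()
        if len(fields) == 5:
            class_id = int(fields[0])
            x, y, w, h = map(float, fields[1:])
            boxes.append((class_id, x, y, w, h))
    return boxes
```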




Images and annotations of sugar beet crop in early growth stages (2 - 8 leaves)

Digital.CSIC. Repositorio Institucional del CSIC
  • Emmi, Luis Alfredo
  • González-de-Santos, Pablo
  • Cortinas, Eloisa
[Description of methods used for collection/generation of data] The data were acquired using the TRI032S-CC RGB camera from Lucid Vision Labs equipped with the SV-04514V lens (resolution: 5 MP; FoV: 59.4° × 79°).
This dataset consists of a set of images of sugar beet fields representing real-world conditions, acquired from a medium-high angle perspective and covering four distinct growth stages based on the BBCH notation system.
The dataset is composed of 4 folders, each containing a specific growth stage based on the BBCH notation system. Within each folder are images in .jpg format and matching .txt files containing the bounding boxes that identify the plants within each image.
This dataset is part of the WeLASER project, which has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 101000256.
Peer reviewed




Data on the WeLASER robot carrying out a mission in the experimental fields of the Center for Automation and Robotics (UPM-CSIC)

Digital.CSIC. Repositorio Institucional del CSIC
  • Emmi, Luis Alfredo
  • González-de-Santos, Pablo
The dataset is made up of 4 files:
  (1) map.geojson: a GeoJSON file describing the digital representation of the experimental field where the tests were carried out;
  (2) mission.geojson: a GeoJSON file describing the geographic coordinates of the planned mission between Building T and Field 3;
  (3) robot_path.geojson: a GeoJSON file describing the geographic coordinates of the robot performing the mission;
  (4) RobotLog.txt: a TXT file with the log generated by the robot during the execution of the mission.
This dataset presents the results of the tests developed on the WeLASER experimental farm, set up in the facilities of the Centre for Automation and Robotics (UPM-CSIC). These experiments consisted of autonomous navigation of the WeLASER robot following a complete mission within the farm. The dataset contains the digital representation of the farm as well as the geographic coordinates that represent the mission. Furthermore, the geographic coordinates that the robot followed during the execution of the mission are also included, in addition to the log generated by the robot, where the different actions and states that the robot performed at each instant of time were recorded. The dataset is shared to promote transparency, reproducibility, and the sharing of our research findings.
This dataset is part of the WeLASER project, which has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 101000256.
Peer reviewed
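As a usage sketch, the snippet below loads robot_path.geojson and estimates the traveled distance with a haversine sum; it assumes the file holds a single LineString feature, which the record does not confirm.

```python
import json
from math import atan2, cos, radians, sin, sqrt

def haversine_m(lon1, lat1, lon2, lat2):
    """Great-circle distance in meters between two WGS84 points."""
    r = 6371000.0
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * r * atan2(sqrt(a), sqrt(1 - a))

# Assumption: the file contains one LineString feature; the record does not
# state the geometry type.
with open("robot_path.geojson") as handle:
    path = json.load(handle)

coords = path["features"][0]["geometry"]["coordinates"]  # [lon, lat] pairs
length = sum(haversine_m(*coords[i], *coords[i + 1]) for i in range(len(coords) - 1))
print(f"Traveled path length: {length:.1f} m")
```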




Enabling Autonomous Navigation on the Farm: A Mission Planner for Agricultural Tasks

Digital.CSIC. Repositorio Institucional del CSIC
  • Cordova-Cardenas, Ruth
  • Emmi, Luis Alfredo
  • González-de-Santos, Pablo
This study presents the development of a route planner, called Mission Planner, for an agricultural weeding robot that generates efficient and safe routes both in the field and on the farm using a graph-based approach. This planner optimizes the robot’s motion throughout the farm and performs weed management tasks tailored for high-power laser devices in narrow-row crops (wheat, barley, etc.) and wide-row crops (sugar beet, maize, etc.). Three main algorithms were integrated: Dijkstra’s algorithm to find the optimal route on the farm, the VRMP (Visibility Road-Map Planner) method to select the route within cultivated fields when roads are not visible, and an improved version of the Hamiltonian path to find the best route between the crop lines. The results support the effectiveness of the strategies implemented, demonstrating that a robot can safely and efficiently navigate through the entire farm and perform an agricultural treatment, in this case study, laser-based weed management. In addition, it was found that the route planner reduced the robot’s operation time, thus improving the overall efficiency of precision agriculture.
This article is part of a project that has received funding from the European Union’s Horizon 2020 research and innovation program under grant agreement No 101000256.
Peer reviewed
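For orientation, a textbook Dijkstra implementation over a small toy farm graph is sketched below; it is not the paper's planner, whose graph construction and cost model are not reproduced here.

```python
import heapq

def dijkstra(graph, start):
    """Shortest-path distances from start on a weighted graph.

    graph: dict mapping node -> list of (neighbor, edge_cost) pairs.
    A generic textbook implementation, not the paper's Mission Planner.
    """
    distances = {start: 0.0}
    queue = [(0.0, start)]
    while queue:
        dist, node = heapq.heappop(queue)
        if dist > distances.get(node, float("inf")):
            continue  # stale queue entry
        for neighbor, cost in graph.get(node, []):
            candidate = dist + cost
            if candidate < distances.get(neighbor, float("inf")):
                distances[neighbor] = candidate
                heapq.heappush(queue, (candidate, neighbor))
    return distances

# Toy farm graph: nodes are waypoints, weights are path lengths in meters.
farm = {
    "barn": [("gate", 40.0)],
    "gate": [("barn", 40.0), ("field_entrance", 120.0)],
    "field_entrance": [("gate", 120.0)],
}
print(dijkstra(farm, "barn"))
```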




Crop Identification and Growth Stage Determination for Autonomous Navigation of Agricultural Robots

Digital.CSIC. Repositorio Institucional del CSIC
  • Cortinas, Eloisa
  • Emmi, Luis Alfredo
  • González-de-Santos, Pablo
This study introduces two methods for crop identification and growth stage determination, focused primarily on enabling mobile robot navigation. These methods include a two-phase approach involving separate models for crop and growth stage identification and a one-phase method employing a single model capable of handling all crops and growth stages. The methods were validated with maize and sugar beet field images, demonstrating the effectiveness of both approaches. The one-phase approach proved to be advantageous for scenarios with a limited variety of crops, allowing a single model to recognize both the type and growth state of the crop, and showed an overall Mean Average Precision (mAP) of about 67.50%. The two-phase method recognized the crop type first, achieving an overall mAP of about 74.2%, with maize detection performing exceptionally well at 77.6%. However, when it came to identifying the specific maize growth state, the mAP only reached 61.3%, due to difficulties in accurately categorizing maize growth stages with six and eight leaves. On the other hand, the two-phase approach proved to be more flexible and scalable, making it a better choice for systems accommodating a wide range of crops.
This article is part of a project that has received funding from the European Union’s Horizon 2020 research and innovation program under grant agreement No. 101000256.
Peer reviewed
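A minimal sketch of the two-phase structure described above follows; all model callables are hypothetical stand-ins injected by the caller, since the trained networks are not part of this record.

```python
# Hedged sketch of the two-phase idea: a crop-type model runs first, then a
# crop-specific growth-stage model is selected and applied. The models here
# are dummy stand-ins, not the paper's trained networks.
def two_phase_pipeline(image, crop_model, stage_models):
    """crop_model: image -> crop label; stage_models: label -> (image -> stage)."""
    crop = crop_model(image)
    stage = stage_models[crop](image)
    return crop, stage

# Toy usage with dummy callables standing in for trained models.
crop, stage = two_phase_pipeline(
    image=None,
    crop_model=lambda img: "maize",
    stage_models={"maize": lambda img: "4 leaves"},
)
print(crop, stage)
```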




WeLASER robot logs executing various missions between January and September 2023

Digital.CSIC. Repositorio Institucional del CSIC
  • Emmi, Luis Alfredo
  • González-de-Santos, Pablo
[Description of methods used for collection/generation of data] The data were acquired during diverse missions executed by the WeLASER robot.
This dataset presents diverse logs of the results of the tests developed with the WeLASER robot over a period of 9 months, between January and September 2023, in 3 different experimental farms in Spain, the Netherlands and Denmark, in real crop fields, executing autonomous navigation operations. The logs include data collected by the robot's various guidance controllers, including the spiral controller and the lateral controller. The dataset also includes the log information already extracted, in two .mat files, and a MATLAB script that allows said information to be graphed and analyzed. The dataset is shared to promote transparency, reproducibility, and the sharing of our research findings.
The dataset is made up of: (a) a folder that includes all the logs of the different controllers executing various missions; (b) two .mat files that include the information extracted from said logs; and (c) a MATLAB script for graphing and analyzing the information contained in the two .mat files.
This dataset is part of the WeLASER project, which has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 101000256.
Peer reviewed
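The extracted logs ship as .mat files, which can also be inspected outside MATLAB; the sketch below uses SciPy's loadmat with a placeholder filename, since the actual .mat file names are not given in the record.

```python
from scipy.io import loadmat

# Placeholder filename: the record does not state the two .mat file names.
# loadmat returns a dict mapping variable names to NumPy arrays.
data = loadmat("controller_logs.mat")
for name, value in data.items():
    if not name.startswith("__"):  # skip MATLAB header metadata keys
        print(name, getattr(value, "shape", None))
```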




A Mission Planner for Autonomous Tasks in Farms

Digital.CSIC. Repositorio Institucional del CSIC
  • Emmi, Luis Alfredo
  • Cordova-Cardenas, Ruth
  • González-de-Santos, Pablo
Sixth Iberian Robotics Conference, University of Coimbra, November 22-24, 2023.
This research introduces a Mission Planner, a route optimization system for agricultural robots. The primary goal is to enhance weed management efficiency using laser technology in narrow-row crops like wheat and barley and wide-row crops like beets and maize. The Mission Planner relies on graph-based approaches and incorporates a range of algorithms to generate efficient and secure routes. It employs three key algorithms: (i) Dijkstra's algorithm for identifying the most optimal farm route, (ii) the Visibility Road-Map Planner (VRMP) to select paths in cultivated fields where visibility is limited, and (iii) an enhanced version of the Hamiltonian path for determining the optimal route between crop lines. This Mission Planner stands out for its versatility and adaptability, owing to its emphasis on graphs and the diverse algorithms it employs for various tasks. This adaptability allows it to provide multiple functions, making it applicable beyond a specific role. Furthermore, its ability to adjust to different agricultural robot sizes and specifications is a significant advantage, as it enables tailored programming to meet safety and movement requirements specific to each robot. These research results affirm the effectiveness of the implemented strategies, demonstrating that a robot can confidently and effectively traverse the entire farm while performing weed management tasks, specifically laser-based weed management.
This article is part of a project that has received funding from the European Union’s Horizon 2020 research and innovation program under grant agreement No 101000256.
Peer reviewed
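To make the row-ordering step concrete, the sketch below applies a plain nearest-neighbor heuristic over row entry points; it is a simplification standing in for the paper's improved Hamiltonian-path algorithm, which is not reproduced here.

```python
# Greedy nearest-neighbor heuristic for ordering crop rows, a simplified
# stand-in for the paper's improved Hamiltonian-path algorithm. Rows are
# represented by the (x, y) coordinates of their entry points.
def order_rows(entry_points):
    """Return entry points in a greedy Hamiltonian-path visiting order."""
    remaining = list(entry_points)
    path = [remaining.pop(0)]  # start from the first row
    while remaining:
        last = path[-1]
        nearest = min(remaining, key=lambda p: (p[0] - last[0]) ** 2 + (p[1] - last[1]) ** 2)
        remaining.remove(nearest)
        path.append(nearest)
    return path

# Toy example: four row entry points spaced along a headland.
print(order_rows([(0.0, 0.0), (3.0, 0.0), (1.5, 0.0), (4.5, 0.0)]))
```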




An Efficient Guiding Manager for Ground Mobile Robots in Agriculture

Digital.CSIC. Repositorio Institucional del CSIC
  • Emmi, Luis Alfredo
  • Fernández Saavedra, Roemi
  • González-de-Santos, Pablo
Mobile robots have become increasingly important across various sectors and are now essential in agriculture due to their ability to navigate effectively and precisely in crop fields. Navigation involves the integration of several technologies, including robotics, control theory, computer vision, and artificial intelligence, among others. Challenges in robot navigation, particularly in agriculture, include mapping, localization, path planning, obstacle detection, and guiding control. Accurate mapping, localization, and obstacle detection are crucial for efficient navigation, while guiding the robotic system is essential to execute tasks accurately and for the safety of crops and the robot itself. Therefore, this study introduces a Guiding Manager for autonomous mobile robots specialized for laser-based weeding tools in agriculture. The focus is on the robot’s tracking, which combines a lateral controller, a spiral controller, and a linear speed controller to adjust to the different types of trajectories that are commonly followed in agricultural environments, such as straight lines and curves. The controllers have demonstrated their usefulness in different real work environments at different nominal speeds, validated on a tracked mobile platform with a width of about 1.48 m, in complex and varying field conditions including loose soil, stones, and humidity. The lateral controller presented an average absolute lateral error of approximately 0.076 m and an angular error of about 0.0418 rad, while the spiral controller presented an average absolute lateral error of about 0.12 m and an angular error of about 0.0103 rad, with a horizontal accuracy of about ±0.015 m and an angular accuracy of about ±0.009 rad, demonstrating its effectiveness in real farm tests.
This article is part of a project that has received funding from the European Union’s Horizon 2020 research and innovation program under grant agreement No. 101000256.
Peer reviewed
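As a generic illustration of lateral guidance (not the article's control laws, which are not reproduced here), the sketch below maps lateral and heading errors to a bounded angular-rate command; the gains and saturation limit are illustrative assumptions.

```python
# Generic proportional lateral controller, a stand-in for the paper's
# Guiding Manager. Gains and the rate limit are illustrative, not from
# the article.
def lateral_control(lateral_error_m, heading_error_rad,
                    k_lat=0.8, k_head=1.5, max_rate_rad_s=0.5):
    """Map lateral and heading errors to a saturated angular-rate command."""
    command = -(k_lat * lateral_error_m + k_head * heading_error_rad)
    return max(-max_rate_rad_s, min(max_rate_rad_s, command))

# Example: robot 0.1 m right of the line, heading 0.05 rad off course.
print(lateral_control(0.1, 0.05))
```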