Publications

15 results found (2 pages)

Images of a maize field in early growth stage

Digital.CSIC. Repositorio Institucional del CSIC
  • Herrera-Diaz, Jesus
  • Emmi, Luis Alfredo
  • González-de-Santos, Pablo
The dataset is composed of several images named as follows: <type of crop>_<date>_<number>.png.
[Methodological information] The data were acquired using the TRI016S-CC RGB camera from Lucid Vision Labs equipped with the SV-0614V lens (resolution: 1.6 MP; FoV: 54.6° × 42.3°).
[Environmental/experimental conditions] The data were acquired by manually operating a mobile platform during different time periods and weather conditions within the same season.
[People involved with sample collection, processing, analysis and/or submission] Jesus Herrera-Diaz (Methodology), Luis Emmi (Investigation), Pablo Gonzalez-de-Santos (Supervision).
This dataset is part of the WeLASER project, which has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 101000256.
Peer reviewed
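The naming convention above can be unpacked with a short sketch. Note that the listing does not spell out the date format or give a sample filename, so the example name below is hypothetical:

```python
import os

def parse_crop_filename(filename):
    """Split a '<type of crop>_<date>_<number>.png' filename into its parts.

    The crop name itself may contain underscores (e.g. 'sugar_beet'),
    so we split from the right: the last two fields are date and number.
    """
    stem, ext = os.path.splitext(filename)
    crop, date, number = stem.rsplit("_", 2)
    return {"crop": crop, "date": date, "number": number, "ext": ext}

# Hypothetical example filename following the stated pattern:
print(parse_crop_filename("maize_2021-05-12_001.png"))
```

Splitting from the right keeps the parser robust for multi-word crop names, since only the last two underscore-separated fields are positional.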




Images of a wheat field in early growth stage

Digital.CSIC. Repositorio Institucional del CSIC
  • Herrera-Diaz, Jesus
  • Emmi, Luis Alfredo
  • González-de-Santos, Pablo
The dataset is composed of several images named as follows: <type of crop>_<date>_<number>.png.
[Methodological information] The data were acquired using the TRI016S-CC RGB camera from Lucid Vision Labs equipped with the SV-0614V lens (resolution: 1.6 MP; FoV: 54.6° × 42.3°).
[Environmental/experimental conditions] The data were acquired by manually operating a mobile platform during different time periods and weather conditions within the same season.
[People involved with sample collection, processing, analysis and/or submission] Jesus Herrera-Diaz (Methodology), Luis Emmi (Investigation), Pablo Gonzalez-de-Santos (Supervision).
This dataset is part of the WeLASER project, which has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 101000256.
Peer reviewed




Toward autonomous mobile robot navigation in early-stage crop growth

Digital.CSIC. Repositorio Institucional del CSIC
  • Emmi, Luis Alfredo
  • Herrera-Diaz, Jesus
  • González-de-Santos, Pablo
19th International Conference on Informatics in Control, Automation and Robotics (ICINCO 2022), 14-16 July 2022, Lisbon, Portugal.
This paper presents a general procedure for enabling autonomous row following in crops during early-stage growth, without relying on absolute localization systems. A model based on deep learning techniques (object detection for wide-row crops and segmentation for narrow-row crops) was applied to accurately detect both types of crops. Tests were performed using a manually operated mobile platform equipped with an RGB camera and a time-of-flight (ToF) camera. Data were acquired during different time periods and weather conditions, in maize and wheat fields. The results demonstrated successful crop detection and enable the future development of a fully autonomous navigation system in cultivated fields during early stages of crop growth.
This article is part of the WeLASER project, which has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 101000256.
Peer reviewed




Images of a maize field with 2-3 leaf collars

Digital.CSIC. Repositorio Institucional del CSIC
  • Emmi, Luis Alfredo
  • González-de-Santos, Pablo
[Description of methods used for collection/generation of data] The data were acquired using the TRI016S-CC RGB camera from Lucid Vision Labs equipped with the SV-0614V lens (resolution: 3.2 MP; FoV: 54.6° × 42.3°).
This dataset consists of 322 jpg images obtained in an experimental maize field at the Centre for Automation and Robotics (UPM-CSIC), Carr. Campo Real, km 0, 200, 28500 Arganda del Rey, Madrid, Spain, when the crop was in an early growth stage (approximately 2-3 leaf collars). The images were taken with the camera mounted on an autonomous tractor developed within the WeLASER project, at a height of approximately 0.5 meters above the ground and at an angle of 22 degrees. All images were taken on the same day, under changing lighting conditions.
This dataset is part of the WeLASER project, which has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 101000256.
The dataset is composed of several images arranged in 4 folders. Within each folder, the images are named as follows: m13-m14-<folder_last_number>-<image_number>.jpg.
Peer reviewed




Images of a sugar beet field with 2-3 leaves

Digital.CSIC. Repositorio Institucional del CSIC
  • Emmi, Luis Alfredo
  • González-de-Santos, Pablo
[Description of methods used for collection/generation of data] The data were acquired using the TRI016S-CC RGB camera from Lucid Vision Labs equipped with the SV-0614V lens (resolution: 3.2 MP; FoV: 54.6° × 42.3°).
This dataset consists of 225 jpg images obtained in an experimental sugar beet field at the Centre for Automation and Robotics (UPM-CSIC), Carr. Campo Real, km 0, 200, 28500 Arganda del Rey, Madrid, Spain, when the crop was in an early growth stage (approximately 2-3 leaves). The images were taken with the camera mounted on an autonomous tractor developed within the WeLASER project, at a height of approximately 0.5 meters above the ground and at an angle of 22 degrees. All images were taken on the same day, under changing lighting conditions.
This dataset is part of the WeLASER project, which has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 101000256.
The dataset is composed of several images arranged in 4 folders. Within each folder, the images are named as follows: m05-sbi-<folder_last_number><image_number>.jpg.
Peer reviewed




Digital representation of smart agricultural environments for robot navigation

Digital.CSIC. Repositorio Institucional del CSIC
  • Emmi, Luis Alfredo
  • Parra, Rebeca
  • González-de-Santos, Pablo
Proceedings of the 10th International Conference on Information and Communication Technologies in Agriculture, Food and Environment (HAICTA 2022), 22-25 September 2022, Athens, Greece.
In recent years, digitization has had a significant impact on food production systems, allowing various technologies and advanced data-processing strategies to be implemented. Alongside the introduction of tools for the digitalization of the field, the automation of tasks through mobile robots has also grown in recent decades. These systems are nourished by field data acquired to carry out autonomous operations, including weed management, fertilizer application, and harvesting, among others. One of the current challenges of integrating robotic solutions in the digital age of agriculture is the development of scalable and interoperable systems able to manage data obtained from third parties. This work presents an overall methodology for map creation using open-source tools, which allows data from the daily activities of an agricultural farm to be managed, and the autonomous tasks of robotic systems to be planned and supervised.
This article is part of the WeLASER project funded by the European Union’s Horizon 2020 research and innovation programme under grant agreement No 101000256.
Peer reviewed




Exploiting the Internet Resources for Autonomous Robots in Agriculture

Digital.CSIC. Repositorio Institucional del CSIC
  • Emmi, Luis Alfredo
  • Fernández, Roemi
  • González-de-Santos, Pablo
  • Francia, Matteo
  • Golfarelli, Matteo
  • Vitali, Giuliano
  • Sandmann, Hendrik
  • Hustedt, Michael
  • Wollweber, Merve
Special Issue "Robots and Autonomous Machines for Agriculture Production".
Autonomous robots in the agri-food sector are increasing yearly, promoting the application of precision agriculture techniques. The same applies to online services and techniques implemented over the Internet, such as the Internet of Things (IoT) and cloud computing, which make big data, edge computing, and digital twin technologies possible. Developers of autonomous vehicles understand that autonomous robots for agriculture must take advantage of these Internet techniques to strengthen their usability. This integration can be achieved using different strategies, and existing tools can facilitate it by providing benefits for developers and users. This study presents an architecture that integrates the different components of an autonomous robot and provides access to the cloud, taking advantage of the services offered for data storage, scalability, accessibility, data sharing, and data analytics. In addition, the study reveals the advantages of integrating new technologies into autonomous robots, which can bring significant benefits to farmers. The architecture is based on the Robot Operating System (ROS), a collection of software applications for communication among subsystems, and FIWARE (Future Internet WARE), a framework of open-source components that accelerates the development of intelligent solutions. To validate and assess the proposed architecture, this study focuses on a specific example: an innovative weeding application with laser technology in agriculture. The robot controller is distributed between the robot hardware, which provides real-time functions, and the cloud, which provides access to online resources. Analyzing the resulting characteristics, such as transfer speed, latency, response and processing time, and response status based on requests, enabled a positive assessment of the use of ROS and FIWARE for integrating autonomous robots and the Internet.
This article is part of a project that has received funding from the European Union’s Horizon 2020 research and innovation program under grant agreement No 101000256.
Peer reviewed
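The FIWARE side of the architecture described above revolves around pushing robot state into a context broker. As a minimal sketch, a robot position could be published as an NGSI-v2 entity via the Orion Context Broker's `/v2/entities` endpoint; the entity type, attribute names, robot id, coordinates, and broker URL below are illustrative assumptions, not details taken from the article:

```python
import json
import urllib.request

def build_robot_entity(robot_id, lat, lon):
    """Build an NGSI-v2 entity dict for a robot's position.

    The entity type and id scheme are illustrative assumptions;
    'geo:json' is the NGSI-v2 attribute type for GeoJSON locations,
    which use longitude-first coordinate order.
    """
    return {
        "id": f"urn:ngsi-ld:AgriRobot:{robot_id}",
        "type": "AgriRobot",
        "location": {
            "type": "geo:json",
            "value": {"type": "Point", "coordinates": [lon, lat]},
        },
    }

def post_entity(entity, broker_url="http://localhost:1026"):
    """POST the entity to an Orion Context Broker (returns 201 on success)."""
    req = urllib.request.Request(
        broker_url + "/v2/entities",
        data=json.dumps(entity).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    return urllib.request.urlopen(req)

# Example payload (coordinates are made-up values near Madrid):
entity = build_robot_entity("welaser-01", 40.3119, -3.4920)
print(entity["id"])
```

Once entities like this exist in the broker, cloud-side consumers can subscribe to position changes instead of polling the robot directly, which is the kind of decoupling the article's data-sharing assessment measures.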




Images and annotations of maize crop in early growth stages (1 - 8 leaves)

Digital.CSIC. Repositorio Institucional del CSIC
  • Emmi, Luis Alfredo
  • González-de-Santos, Pablo
  • Cortinas, Eloisa
[Description of methods used for collection/generation of data] The data were acquired using the TRI032S-CC RGB camera from Lucid Vision Labs equipped with the SV-04514V lens (resolution: 5 MP; FoV: 59.4° × 79°).
This dataset consists of a set of images of maize fields representing real-world conditions, acquired from a medium-high angle perspective and covering five distinct growth stages based on the BBCH notation system.
This dataset is part of the WeLASER project, which has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 101000256.
The dataset is composed of 5 folders, each containing a specific growth stage based on the BBCH notation system. Within each folder are images in .jpg format and matching .txt files containing the bounding boxes that identify the plants within each image.
Peer reviewed
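The listing says each image has a matching .txt file of bounding boxes but does not specify the annotation format. A common convention for such paired files is one normalized `class cx cy w h` row per plant; under that assumption (which the listing does not confirm), a loader might look like:

```python
def load_boxes(txt_path, img_width, img_height):
    """Read normalized 'class cx cy w h' rows and return pixel-space boxes.

    Assumes each line holds an integer class id plus a center/size box
    normalized to [0, 1] -- a common annotation layout, though the
    dataset listing does not state which format it uses.
    Returns (class_id, x1, y1, x2, y2) tuples in pixel coordinates.
    """
    boxes = []
    with open(txt_path) as fh:
        for line in fh:
            cls, cx, cy, w, h = line.split()
            cx, cy, w, h = (float(v) for v in (cx, cy, w, h))
            x1 = (cx - w / 2) * img_width
            y1 = (cy - h / 2) * img_height
            x2 = (cx + w / 2) * img_width
            y2 = (cy + h / 2) * img_height
            boxes.append((int(cls), x1, y1, x2, y2))
    return boxes
```

Pairing an image with its annotations is then a matter of swapping the .jpg extension for .txt on the shared filename stem.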




Images and annotations of sugar beet crop in early growth stages (2 - 8 leaves)

Digital.CSIC. Repositorio Institucional del CSIC
  • Emmi, Luis Alfredo
  • González-de-Santos, Pablo
  • Cortinas, Eloisa
[Description of methods used for collection/generation of data] The data were acquired using the TRI032S-CC RGB camera from Lucid Vision Labs equipped with the SV-04514V lens (resolution: 5 MP; FoV: 59.4° × 79°).
This dataset consists of a set of images of sugar beet fields representing real-world conditions, acquired from a medium-high angle perspective and covering four distinct growth stages based on the BBCH notation system.
This dataset is part of the WeLASER project, which has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 101000256.
The dataset is composed of 4 folders, each containing a specific growth stage based on the BBCH notation system. Within each folder are images in .jpg format and matching .txt files containing the bounding boxes that identify the plants within each image.
Peer reviewed




Data on the WeLASER robot carrying out a mission in the experimental fields of the Center for Automation and Robotics (UPM-CSIC)

Digital.CSIC. Repositorio Institucional del CSIC
  • Emmi, Luis Alfredo
  • González-de-Santos, Pablo
The dataset is made up of 4 files:
(1) map.geojson: a GeoJSON file describing the digital representation of the experimental field where the tests were carried out;
(2) mission.geojson: a GeoJSON file describing the geographic coordinates of the planned mission between Building T and Field 3;
(3) robot_path.geojson: a GeoJSON file describing the geographic coordinates of the robot while performing the mission;
(4) RobotLog.txt: a TXT file with the log generated by the robot during the execution of the mission.
This dataset presents the results of tests conducted on the WeLASER experimental farm, set up at the facilities of the Centre for Automation and Robotics (UPM-CSIC). The experiments consisted of autonomous navigation of the WeLASER robot following a complete mission within the farm. The dataset contains the digital representation of the farm, the geographic coordinates that represent the mission, the geographic coordinates that the robot followed during the execution of the mission, and the log in which the different actions and states the robot performed at each instant of time were recorded. This is done to promote transparency, reproducibility, and the sharing of our research findings.
This dataset is part of the WeLASER project, which has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 101000256.
Peer reviewed
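GeoJSON files like the three above can be read with the standard json module. The sketch below assumes the files store standard GeoJSON geometries per RFC 7946 (the listing does not show the files' internal schema), and collects every Point and LineString coordinate:

```python
import json

def load_coordinates(geojson_path):
    """Collect (lon, lat) pairs from every geometry in a GeoJSON file.

    Handles the plain-geometry, Feature, and FeatureCollection roots
    defined by the GeoJSON spec (RFC 7946); coordinate order is
    longitude first, as the spec requires.
    """
    with open(geojson_path) as fh:
        doc = json.load(fh)

    def geometries(node):
        if node["type"] == "FeatureCollection":
            for feature in node["features"]:
                yield feature["geometry"]
        elif node["type"] == "Feature":
            yield node["geometry"]
        else:
            yield node

    coords = []
    for geom in geometries(doc):
        if geom["type"] == "Point":
            coords.append(tuple(geom["coordinates"]))
        elif geom["type"] == "LineString":
            coords.extend(tuple(pt) for pt in geom["coordinates"])
    return coords
```

Loading mission.geojson and robot_path.geojson this way and comparing the two coordinate sequences gives a simple view of how closely the robot tracked the planned mission.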