Document Type: Research Paper

Authors

1 PhD Student, Department of Biosystem Mechanical Engineering, College of Abouraihan, University of Tehran, Tehran, Iran

2 Associate Professor, Department of Biosystem Mechanical Engineering, College of Abouraihan, University of Tehran, Tehran, Iran

3 Assistant Professor, Department of Biosystems Engineering, Gorgan University of Agricultural Sciences and Natural Resources, Gorgan, Iran

4 Associate Professor, Faculty of Computer Engineering, Iran University of Science and Technology, Tehran, Iran

Abstract

One of the most important issues in spraying fields and greenhouses is reducing pesticide use, mitigating the hazardous effects of spraying, protecting the environment, improving spraying quality, and safeguarding public health. Children have weaker immune systems and are less able to detoxify toxic and harmful compounds. For this reason, the adverse effects of pesticides on children's health are more serious than for adults, which doubles the need to reduce pesticide use and follow proper spraying practices to protect children from cancer. In this study, the robot sprays based on a measurement of plant bulk volume in order to reduce pesticide consumption. The robot is mechanically designed to move between crop rows, open its manipulator step by step, and capture a depth image of each plant section in front of it; it then analyzes the image of each section, estimates the plant volume in that section, and sprays the section according to the calculated volume. This cycle of imaging, volume estimation, and spraying based on the estimated volume is repeated at each stage of manipulator opening until the full height of the plant has been covered, after which the manipulator is fully retracted.
The robot detects plant height intelligently and closes the manipulator after imaging and spraying the last section. The manipulator can assess and spray plants up to 270 cm in height. The robot consists of several parts: the camera-and-nozzle chamber, the nozzle, a Microsoft Kinect camera (version 1), the manipulator and its actuation mechanism, the pump and solution tank, the processor, the Arduino and relay boards, and the cart and drive system. To design the robot, the static forces applied to the manipulator were first examined and then the kinematic calculations of the manipulator were performed; the results confirmed the accuracy of the kinematic equations. After the design calculations, and considering the environmental conditions and construction cost, a three-dimensional model of the robot was designed in SolidWorks 2016, and construction proceeded step by step based on this model. The robot is controlled by MATLAB 2010 software, in which the entire working algorithm is coded; the main controller is therefore a laptop processor. The laptop controlling the robot sits in a built-in compartment at the back of the robot and transmits all commands to the actuators through the Arduino board and the relay board. Input data are transmitted to the processor by the Kinect camera, the processor makes the necessary decisions according to the coded program, and the output commands are passed to the Arduino and relay boards to drive the actuators. An AMD A10-4655M APU was used as the processor. The Developer Toolkit Browser v1.8.0, KinectExplorer-D2D, and the Kinect for Windows Software Development Kit (SDK) were used to connect the Kinect camera to the Windows laptop. Two coefficients, α and β, are needed to determine the plant volume in each section.
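The stage-by-stage image–estimate–spray cycle described above can be sketched as follows. This is a minimal Python illustration, not the authors' MATLAB implementation: the function name `spray_plant`, the per-stage (M, D) measurement pairs, and the stopping condition are assumed placeholders.

```python
def spray_plant(section_measurements, alpha, beta):
    """Step the manipulator up one stage at a time: for each stage, take the
    (M, D) width/spacing measurement from that stage's depth image, estimate
    the section volume, and spray proportionally; stop at the top of the plant.

    section_measurements: per-stage pairs (M, D), where M is the observed
    plant width and D the distance between two consecutive plants, in pixels.
    alpha: mean manually measured plant volume (cc); beta: expert correction.
    Returns the list of per-section spray volumes (cc)."""
    doses = []
    for m, d in section_measurements:
        if m == 0:                           # no plant pixels: top of plant reached
            break
        section_volume = (m / d) * alpha     # scale-invariant volume estimate
        doses.append(beta * section_volume)  # expert-corrected dose for this section
    # at this point the manipulator would be fully retracted
    return doses
```

For example, with α = 40 cc and β = 0.9, a plant occupying half the row spacing in the first stage and a third in the second receives two proportional doses and the loop then stops at the empty stage.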
α is the average plant volume of several plants, calculated manually, and β is a correction factor multiplied into the plant volume estimated by the robot so that the actual volume of sprayed solution better matches the plant's needs and the opinion of relevant experts. The volume estimated by the robot in each section is the product of a volume factor and the average plant volume α. The volume factor is the average observed plant width in pixels (M) divided by the pixel distance between two consecutive plants (D). Because both M and D are measured in pixels, their ratio is scale invariant (independent of the distance from the camera to the object), so the section volume obtained by multiplying the volume factor by α does not depend on camera distance.
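A worked numeric example of this calculation, with assumed values (the paper does not report per-plant M, D, α, or β figures):

```python
alpha = 35.0     # cc: mean manually measured plant volume (assumed value)
beta = 0.85      # expert correction factor (assumed value)
M, D = 60, 120   # observed plant width and plant-to-plant spacing, in pixels

volume_factor = M / D                   # 0.5: dimensionless, scale invariant
section_volume = volume_factor * alpha  # 17.5 cc: robot's raw estimate
spray_volume = beta * section_volume    # ≈ 14.9 cc: expert-corrected dose
```

Doubling the camera distance shrinks M and D by the same factor, leaving `volume_factor`, and hence the dose, unchanged, which is the scale-invariance property the paper relies on.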
To calculate the average plant volume manually, several plants are selected at random and their volumes are measured by computational methods or by the water-displacement (flooding) method; the average volume of these plants is then introduced to the program as α. The more accurately the manual volumes are measured, and the more plants are selected, the more accurately α, and hence the final plant volume, will be calculated. The robot should spray the right amount of solution depending on the type of plant and its condition. A dose based on the raw volume estimate alone may not be scientifically justified by experts, given the plant type, spraying time, pesticide concentration, and plant needs. Therefore, the correction factor β is multiplied into the volume estimated by the robot so that the solution is sprayed according to the needs of the plant and the opinion of experts. The evaluation results show that the robot sprays different amounts of solution for plants of different volumes, and the amount of solution sprayed was proportional to plant volume. The average volume of solution sprayed by the robot was 27.1 cc versus 33.1 cc for the human worker, and the standard deviations of the sprayed volumes were 2.94 and 3.11, respectively. In other words, the robot sprays more precisely, and its pesticide consumption is estimated to be lower than the worker's; the evaluation thus indicates an acceptable reduction in pesticide consumption. Online operation, i.e. collecting plant information and spraying the solution moments after data processing, is one of the important features of this research.
The robot's performance in online and scale-invariant (independent of the distance from the camera to the object) operation was likewise judged acceptable and useful.

