Agricultural Robot

Authors

  • Suryanaga Raju G., Student, Saveetha School of Engineering, SIMATS, Chennai, India
  • Puviarasi R., Saveetha School of Engineering, SIMATS, Chennai, India

DOI:

https://doi.org/10.61841/z7xntc58

Keywords:

Robot, Agriculture, Design aspects, Environment

Abstract

For any agricultural robot, the first priority is location: based on its location, the robot operates within a fixed working area. The second priority is path detection among the crops. Fields are planted in rows, and the robot must detect those rows to derive its navigation path. Image processing techniques are needed for reliable row detection and for high efficiency of the crop yield. To this end, we propose an autonomous robot for the farming process that saves the time and energy required for repetitive farming tasks and increases yield productivity by treating every crop individually, following the precision-farming concept. The experiment demonstrates that the proposed method achieves good efficiency and detection accuracy. Coverage of a partially known workspace for information gathering is the core problem in several applications, such as search and rescue, precision agriculture, and monitoring of critical infrastructure; the proposed approach adapts its plan dynamically to improve planning efficiency.
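The row-detection step mentioned in the abstract can be sketched as follows. This is a minimal, hypothetical illustration, not the authors' implementation: it assumes a top-down binary vegetation mask as input, locates crop rows as dense runs in the column-wise vegetation profile, and places a navigation lane midway between adjacent rows. The function names and the synthetic field are illustrative assumptions.

```python
import numpy as np

def detect_crop_rows(mask):
    """Return the column centres of crop rows in a top-down binary
    vegetation mask (1 = plant pixel, 0 = soil)."""
    profile = mask.sum(axis=0).astype(float)   # vegetation density per column
    threshold = 0.5 * profile.max()            # keep columns at least half as dense as the densest
    above = profile >= threshold
    centres, start = [], None
    for x, flag in enumerate(above):
        if flag and start is None:
            start = x                              # a run of dense columns begins
        elif not flag and start is not None:
            centres.append((start + x - 1) // 2)   # centre of the finished run
            start = None
    if start is not None:                          # a run reaching the field edge
        centres.append((start + len(above) - 1) // 2)
    return centres

def navigation_lanes(row_centres):
    """Place one driving lane midway between each pair of adjacent rows."""
    return [(a + b) // 2 for a, b in zip(row_centres, row_centres[1:])]

# Synthetic 100x100 field with three crop rows centred at x = 20, 50, 80.
field = np.zeros((100, 100), dtype=np.uint8)
for cx in (20, 50, 80):
    field[:, cx - 2:cx + 3] = 1

rows = detect_crop_rows(field)
lanes = navigation_lanes(rows)
print(rows, lanes)   # -> [20, 50, 80] [35, 65]
```

A real system would first obtain the vegetation mask from camera images (for example, by thresholding a vegetation index) and would have to handle perspective, weeds, and gaps in the rows; the run-centre logic above only conveys the basic idea of reducing row detection to a 1-D peak-finding problem.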


References

1. Gulam Amer, S.M.M. Mudassir, M.A. Malik, "Design and Operation of Wi-Fi Agribot Integrated System," IEEE International Conference on Industrial Instrumentation and Control, May 2015.

2. Fernando A. Auat Cheein and Ricardo Carelli, "Agricultural Robotics: Unmanned Robotic Service Units in Agricultural Tasks," IEEE Industrial Electronics Magazine, Sep. 2013.

3. Sajjad Yaghoubi, Negar Ali Akbarzadeh, Shadi Sadeghi Bazargani, "Autonomous Robots for Agricultural Tasks and Farm Assignment and Future Trends in Agro Robots," International Journal of Mechanical and Mechatronics Engineering, June 2013.

4. Pavan, C., Dr. B. Sivakumar, "Wi-Fi Robot Video Surveillance Monitoring System," International Journal of Scientific and Engineering Research, August 2012.

5. Tijmen Bakker, Kees van Asselt, Jan Bontsema, Joachim Müller, and Gerrit van Straten, "A Path Following Algorithm for Mobile Robots," Autonomous Robots (Springer Science+Business Media), vol. 29, pp. 85–97, 2010; John Billingsley, Denny Oetomo, "Agricultural Robotics," IEEE Robotics and Automation Magazine, December 2009.

6. D. C. Slaughter, D. K. Giles, and D. Downey, "Autonomous robotic weed control systems: A review," Elsevier Comput. Electron. Agric., vol. 61, no. 1, pp. 63–78, 2008.

7. R. Eaton, J. Katupitiya, K. W. Siew, and B. Howarth, "Autonomous farming: Modelling and control of agricultural machinery in a unified framework," IEEE Int. Conf. on Mechatronics and Machine Vision in Practice, Dec. 2008, vol. 1, pp. 499–504.

8. N. Chebrolu, T. Läbe, and C. Stachniss. Robust long-term registration of UAV images of crop fields for precision agriculture. IEEE Robotics and Automation Letters, 3(4):3097–3104, 2018.

9. G. Christie, G. Warnell, and K. Kochersberger. Semantics for UGV registration in GPS-denied environments. arXiv preprint, 2016.

10. F. Dellaert, D. Fox, W. Burgard, and S. Thrun. Monte Carlo localization for mobile robots. In IEEE International Conference on Robotics and Automation (ICRA), May 1999.

11. M. Ding, K. Lyngbaek, and A. Zakhor. Automatic registration of aerial imagery with untextured 3D LiDAR models. In 2008 IEEE Conference on Computer Vision and Pattern Recognition, pages 1–8, June 2008.

12. J. Dong, J. G. Burnham, B. Boots, G. Rains, and F. Dellaert. 4D Crop Monitoring: Spatio-Temporal Reconstruction for Agriculture. In Proc. of the IEEE International Conf. on Robotics and Automation (ICRA), 2017.

13. W. Förstner and B. Wrobel. Photogrammetric Computer Vision: Statistics, Geometry, Orientation, and Reconstruction. Springer Verlag, 2016.

14. F. Kraemer, A. Schaefer, A. Eitel, J. Vertens, and W. Burgard. From Plants to Landmarks: Time-invariant Plant Localization that uses Deep Pose Regression in Agricultural Fields. In IROS Workshop on AgriFood Robotics, 2017.

15. Balan B., Tech M. "Sensor-based smart agriculture using IoT," International Journal of MC Square Scientific Research, vol. 9, no. 2, 2017.

16. R. Kümmerle, B. Steder, C. Dornhege, A. Kleiner, G. Grisetti, and W. Burgard. Large scale graph-based SLAM using aerial images as prior information. Autonomous Robots, 30(1):25–39, Jan 2011.

17. T. B. Kwon and J. B. Song. A new feature commonly observed from air and ground for outdoor localization with an elevation map built by an aerial mapping system. Journal of Field Robotics, 28(2):227–240, 2010.

18. K. Y. K. Leung, C. M. Clark, and J. P. Huissoon. Localization in urban environments by matching ground level video images with an aerial image. In 2008 IEEE International Conference on Robotics and Automation, pages 551–556, May 2008.

19. S. Leutenegger, M. Chli, and R. Siegwart. BRISK: Binary robust invariant scalable keypoints. In Proc. of the IEEE International Conf. on Computer Vision (ICCV), 2011.

20. P. Lottes, J. Behley, N. Chebrolu, A. Milioto, and C. Stachniss. Joint stem detection and crop-weed classification for plant-specific treatment in precision farming. In Proceedings of the IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS), 2018.

21. D. G. Lowe. Distinctive Image Features from Scale-Invariant Keypoints. Intl. Journal of Computer Vision (IJCV), 60(2):91–110, 2004.

Published

30.04.2020

How to Cite

G., S. R., & R., P. (2020). Agricultural Robot. International Journal of Psychosocial Rehabilitation, 24(2), 5900-5903. https://doi.org/10.61841/z7xntc58