Ahlin K. et al., Autonomous leaf picking using deep learning and visual-servoing, IFAC-PapersOnLine, 49(16), 177-183, (2016).
Bateux Q., Going further with direct visual servoing, Ph.D. Thesis, Université de Rennes 1, (2018).
Bateux Q., Marchand E., Leitner J., Chaumette F., Corke P., Visual servoing from deep neural networks, arXiv:1705.08940, (2017).
Bateux Q., Marchand E., Leitner J., Chaumette F., Corke P., Training deep neural networks for visual servoing, In: 2018 IEEE International Conference on Robotics and Automation (ICRA), pp. 1-8. IEEE, (2018).
Bromley J., Guyon I., LeCun Y., Säckinger E., Shah R., Signature verification using a ‘Siamese’ time delay neural network, In: Proc. Adv. Neural Inf. Process. Syst., pp. 737-744, (1994).
Chang W.C., Precise positioning of binocular eye-to-hand robotic manipulators, Journal of Intelligent and Robotic Systems, 49(1), 219-236, (2007).
Chaumette F., A first step toward visual servoing using image moments, Proc. of IEEE/RSJ IROS, 378-438, (2002).
Chaumette F., Image moments: a general and useful set of features for visual servoing, IEEE Trans. on Robotics, 20(4), 713-723, (2004).
Chaumette F., Hutchinson S., Visual servoing and visual tracking, In: Springer Handbook of Robotics, Springer, (2008).
Chaumette F., Hutchinson S., Visual Servo Control Part I: Basic Approaches, IEEE Robotics & Automation Magazine, 13(4), 82-90, (2006).
Chaumette F., Rives P., Espiau B., Positioning a robot with respect to an object, tracking it and estimating its velocity by visual servoing, Proc. of the IEEE International Conference on Robotics and Automation, 2248-2253, (1991).
Chaumette F., Potential problems of stability and convergence in image-based and position-based visual servoing, In: The Confluence of Vision and Control, Lecture Notes in Control and Information Sciences, vol. 237, pp. 66-78, Springer-Verlag, New York, (1998).
Cheng H. et al., Deep learning for manipulator visual positioning, In: 2018 IEEE 8th Annual International Conference on CYBER Technology in Automation, Control, and Intelligent Systems (CYBER), IEEE, (2018).
Cheng H., Wang Y., Meng M.Q.-H., A Vision-Based Robot Grasping System, IEEE Sensors Journal, 22(10), 9610-9620, (2022).
Chesi G., Hashimoto K., Prattichizzo D., Vicino A., Keeping features in the field of view in eye-in-hand visual servoing: a switching approach, IEEE Trans. Robot, 20(5), 908-914, (2004).
Copoț C., Tehnici de control pentru sistemele servoing vizuale [Control techniques for visual servoing systems], Ph.D. Thesis, “Gheorghe Asachi” Technical University of Iași, (2012).
Collewet C., Chaumette F., Positioning a camera with respect to planar objects of unknown shape by coupling 2-D visual servoing and 3-D estimations, IEEE Trans. Robot. Autom. 18(3), 322-333, (2002).
Gao J., He Y., Chen Y., Li Y., Learning end-to-end visual servoing using an improved soft actor-critic approach with centralized novelty measurement, IEEE Transactions on Instrumentation and Measurement, 72, 1-12, (2023).
Guo J., Nguyen H.T., Liu C., Cheah C.C., Convolutional neural network-based robot control for an eye-in-hand camera, IEEE Transactions on Systems, Man, and Cybernetics: Systems, 53(8), 4764-4775, (2023).
Gubbi M.R., Bell M.A.L., Deep learning-based photoacoustic visual servoing: Using outputs from raw sensor data as inputs to a robot controller, In 2021 IEEE International Conference on Robotics and Automation (ICRA), pp. 14261-14267, (2021).
Hao T., Xu D., Robotic grasping and assembly of screws based on visual servoing using point features, The International Journal of Advanced Manufacturing Technology, 129(9), 3979-3991, (2023).
Harish Y.V.S. et al., DFVS: Deep flow guided scene agnostic image based visual servoing, In: 2020 IEEE International Conference on Robotics and Automation (ICRA), IEEE, (2020).
He Y., Gao J., Chen Y., Deep learning-based pose prediction for visual servoing of robotic manipulators using image similarity, Neurocomputing, 491, 343-352, (2022).
Hill J., Park W.T., Real time control of a robot with a mobile camera, In: 9th International Symposium on Industrial Robots, pp. 233-246, (1979).
Hancock J., Langer D., Active laser radar for high-performance measurements, In: Proc. of IEEE International Conference on Robotics and Automation (ICRA), vol. 2, pp. 1465-1470, (1998).
Hutchinson S., Hager G., Corke P., A tutorial on visual servo control, IEEE Transactions on Robotics and Automation, 12(5), (1996), 651-670.
Katara P., Harish Y.V.S., Pandya H., Gupta A., Sanchawala A., Kumar G., Krishna M., DeepMPCVS: Deep model predictive control for visual servoing, In: Conference on Robot Learning, pp. 2006-2015, (2021).
Lazo J.F. et al., Autonomous intraluminal navigation of a soft robot using deep-learning-based visual servoing, In: 2022 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), IEEE, (2022).
Liu J., Li Y., An Image Based Visual Servo Approach with Deep Learning for Robotic Manipulation, arXiv preprint arXiv:1909.07727, (2019).
Mahony R., Corke P., Chaumette F., Choice of image features for depth-axis control in image based visual servo control, Proc. of IEEE/RSJ International Conference on Intelligent Robots and Systems, Lausanne, Switzerland, (2002), 390-395.
Malis E., Chaumette F., Boudet S., 2 1/2 d visual servoing, IEEE Trans. Robot. Autom. 15(2), 238-250, (1999).
Marchand E., Chaumette F., Feature tracking for visual servoing purposes, Robotics and Autonomous Systems, 52(1), 53-70, (2005).
Marchand E., Subspace-based direct visual servoing, IEEE Robot. Autom. Lett. 4(3), 2699-2706, (2019).
Marchand E., Direct visual servoing in the frequency domain, IEEE Robot. Autom. Lett. 5(2), 620-627, (2020).
Adrian N., Do V.-T., Pham Q.-C., DFBVS: Deep Feature-Based Visual Servo, arXiv preprint arXiv:2201.08046, (2022).
Ribeiro E.G., Mendes R.Q., Grassi V. Jr., Real-time deep learning approach to visual servo control and grasp detection for autonomous robotic manipulation, Robotics and Autonomous Systems, 139, (2021).
Saxena A., Pandya H., Kumar G., Gaud A., Exploring convolutional networks for end-to-end visual servoing, In: 2017 IEEE International Conference on Robotics and Automation (ICRA), IEEE, Marina Bay Sands, Singapore, pp. 3817-3823, (2017).
Shi L., Copot C., Vanlanduit S., A bayesian deep neural network for safe visual servoing in human–robot interaction, Frontiers in Robotics and AI, 8, 687031, (2021).
Tang J., Kim H., Guizilini V., Pillai S., Ambrus R., Neural outlier rejection for self-supervised keypoint learning, In: International Conference on Learning Representations, (2020).
Tokuda F., Arai S., Kosuge K., Convolutional neural network-based visual servoing for eye-to-hand manipulator, IEEE Access, 9, 91820-91835, (2021).
Yu C., Cai Z., Pham H., Pham Q.-C., Siamese convolutional neural network for sub-millimeter accurate camera pose estimation and visual servoing, In: Proc. IEEE/RSJ Int. Conf. Intell. Robots Syst. (IROS), pp. 935-941, (2019).