
Introduction

A disaster, in general, is a sudden calamity that causes significant damage to people's lives and property. The damage caused by a natural disaster is difficult to quantify and varies with geographical location, climate, and the type and vulnerability of the earth's surface. In general, a disaster has the following effects on the affected areas:

It completely interferes with one's normal daily life.

It has a negative impact on emergency response systems.

It degrades the mechanisms in place for meeting needs that are essential and indispensable for sustaining life, such as nutrition, health, and shelter, with the extent of the deterioration depending on the severity and intensity of the disaster. A disaster may thus be described as a severe disruption of society and a loss of environmental resources, inflicting harm that overwhelms society's ability to cope with its limited resources.

Many natural disasters occur in the modern world, resulting in the loss of life. We cannot stop natural disasters from happening; however, we can assist those who are affected by them. Natural disasters cause widespread devastation [1], which reduces the amount of time available to search for and locate survivors. In that window, it becomes difficult for the rescue team to visit every location, and rescue operations inevitably take considerable time to reach any given spot after the disaster strikes. This situation motivates the development of a system for rescuing humans while they are still alive. A rescue robot can play an important role here because it is controlled by a human via radio control: it can transmit live images and detect humans trapped beneath debris.

The detection of humans trapped alive beneath debris [2] is a main role of the search and rescue (SAR) robot, which operates in both public and private settings. Apart from safety concerns, the robot can use this data to plan and execute its next movement in the surroundings by differentiating between humans and non-animated objects. Onboard sensors allow the robot to observe the surrounding environment and estimate the current location [3] of all relevant moving objects within range. Tracking analysis can be used to classify these objects and determine whether or not they are human. Thanks to recent research and development in mobile robotics and improvements in the computational power of our devices, robots [4] can now process large amounts of sensor data, improving their situational awareness. As a result, robots increasingly operate alongside a variety of other ground vehicles and semi-autonomous vehicles.

According to a survey conducted with regard to the issue, although sensors are effective at predicting disasters and managing the pre-disaster stage by detecting water levels in floods and fire temperatures, their importance, effectiveness, and greatest impact lie in locating survivors and saving their lives by detecting vital signs [5]. SAR operations in response to natural or human disasters have the primary goal of discovering and saving potential victims as quickly as possible; therefore, quick response and accurate actions are required. While normal communications may be disrupted, a wireless sensor network can be used to assist the rescue crew. This type of network, among other benefits, permits data collection close to events and allows persistence over time [6]. The topic of monitoring greatly benefits from the development of wearable sensors as portable electronic devices. Different systems have been suggested to measure various parameters [7]. Because of their lightweight design, flexible performance, and ease of processing into diverse structures, fiber-based strain sensors are regarded as a crucial component of smart wearables [8].

Recent developments in the field of artificial intelligence (AI) have made it possible to build autonomous machines, robots, and gadgets that are notably characterized by the capacity to make choices and carry out activities without the involvement of humans [9]. Applications of underwater wireless sensor networks (UWSNs) include underwater navigation, environment monitoring, and disaster management. Since UWSN nodes are constrained to internal batteries, efficient use of the energy supply becomes crucial [10]. A low-cost robotic device with a Raspberry Pi as an intermediate processor enables post-earthquake live human detection, location tracking, environmental monitoring, and rescue operations. It utilizes radar communication in areas with limited internet access, enhancing SAR efficiency [11]. Another system uses an ARM7 controller, a low-cost camera, and sensors for human detection; an unmanned vehicle with an infrared (IR) sensor navigates debris, transmitting vital data via ZigBee for swift rescue operations [12]. A tested sensor system for detecting humans under rubble was found effective, utilizing CO2 sensors for casualty localization and a microphone-based voice recognition algorithm. The limitations of this system include the difficulty of using gas sensors in open spaces and the limited accessibility of thermal cameras [13].

The application of wireless sensor networks in mobile rescue robots improves disaster response, ensuring no one is left behind. This system monitors human presence, weather conditions, and victim location, enhancing data transmission efficiency and real-time monitoring [14]. One cost-effective rescue robot utilizes sensors to detect living humans trapped beneath debris and relays the information to the rescue team, offering enhanced reliability. Enhancements include an IR camera, a global positioning system (GPS) for precise location, a global system for mobile communication (GSM) module for extended communication range, and metal/bomb detectors for added protection [15]. The Human Detection Robot detects human presence by transmitting signals from a transmitter to a receiver and notifying the user with continuous buzzing. It moves in all directions to expand the detection area, automatically navigating around obstacles [16]. The PC-managed robot uses a passive infrared (PIR) sensor to detect living humans by sensing their IR radiation, transmitting their location wirelessly. This robot is suitable for disaster zones and hazardous areas like boilers and reactors [17]. An autonomous vehicle with advanced sensors and powerful motors saves lives during earthquakes, surpassing manual efforts in speed and effectiveness [18]. A microwave-based life-detection system locates buried humans through rubble or other barriers. Operating at 1150 MHz or 450 MHz, it detects breathing and heartbeat signals. Signal processing can address background noise and operator interference [19].

A new approach for detecting living humans in disaster scenarios combines sensors, the ATMEGA16 microcontroller, GSM technology, and programmable logic controller (PLC) systems. This user-friendly, economical, and efficient device provides timely rescue alerts and utilizes a range of sensors for detection [20]. A rescue robot aids in various rescue operations, such as mining, urban ruins, and emergencies, reducing personnel requirements and accessing inaccessible areas. It utilizes PIR sensors, radio frequency (RF) technology, and ARM7 microcontroller unit (MCU) for motion detection, control, and precise location tracking [21].

Various models have been developed to detect trapped humans during natural disasters and calamities. These models leverage advancements in technology, such as high-speed technologies and growing computer capacity, to address challenges involving the identification of human presence under debris and in hard-to-reach areas [22]. Ensuring secure over-the-air (OTA) software updates in smart vehicles is crucial for preventing accidents, protecting lives, and minimizing property loss. Focus is placed on maintaining data integrity and service integrity to enhance OTA software update service security in smart vehicles, considering their impact on future vehicular applications [23]. Linearization schemes address nonlinearity in sensing devices, enhancing measurement accuracy. Different methods for linearizing sensor characteristics are explored, including analog and digital approaches. Analog methods remain popular, while digital methods offer advantages in terms of reduced time, cost, and improved accuracy through software techniques [24].

Automation technologies leverage cameras and sensors to collect data and recognize human activities. Protecting data and preserving privacy are paramount, especially when extracting sensitive information. Jung's analysis [25] explores current research trends, techniques, and challenges in privacy-preserving human and human-activity recognition. On the other hand, modern visual tracking systems rely heavily on the object detection system's efficiency and robustness. The study by Hernández and Sallis [26] introduces a novel Bernoulli filter with determinantal point processes observations; this proposed observation model effectively selects groups of detections based on high detection scores and low feature correlation, enhancing the filter's robustness. The global aviation industry has adopted new technologies to cope with the impact of the COVID-19 pandemic. An intelligent mobile robot system is introduced for guiding passengers at busy airports through voice communication and face detection. The system demonstrates effectiveness in guiding passengers to desired areas, as evidenced by the implementation process and evaluation [27]. The autonomous group interactions for robots (AGIR) framework enables a robot to detect and join social groups based on F-formations, utilizing onboard sensors. It detects individuals, estimates positions and orientations, identifies groups, and suggests robot positions. Simulated evaluations demonstrate high accuracy and real-time performance, validating the framework's effectiveness [28].

Wearable devices have diverse functions and applications, particularly in the medical and health fields. Cheng et al.'s review [29] highlights recent advancements in wearable sensors and their role in monitoring physiological signals, evaluating body movement, and assessing environmental quality. The goal is to provide insights and directions for future development and wider adoption of wearable devices across various domains [29]. One system, by utilizing a Fourier series and a generalized likelihood ratio test, accurately identifies breathing patterns and estimates the breathing rate [30]. From common electrical appliances to flourishing intelligent robots, the touch sensor medium has primarily been used for information transmission between humans and machines. The unavoidable mechanical wear and user-to-user transmission of pathogenic microorganisms are two serious issues that arise from this type of direct interaction [31]. In social settings and for collaboration between teams of humans and robots, a robot's capacity to recognize and join groups of people is becoming increasingly crucial [32].

Related Works

Several surveys and reviews on ground robot navigation using sensors in both structured and unstructured environments have been published in the literature. Kabilan et al. [33] developed a low-cost, self-contained rescue robot built around a Raspberry Pi with a midrange processor. The robotic device can detect humans remaining alive in a post-earthquake situation and is additionally capable of GPS tracking, monitoring of current environmental conditions, and live video streaming. After assembly, the module's functionality was double-checked; the device has been tested and found to outperform a number of already developed systems, and it can also perform rescue operations in the event of natural calamities. Because Internet access would be limited in disaster areas, radar communication is preferred over the Internet of Things (IoT) in a real-time system. With the help of this robot, the success and efficiency of SAR operations can be improved.

Uddin and Islam [34] developed a semi-autonomous mobile rescue robot with PIR sensors that can detect living humans in an inaccessible location in a disaster zone. The robot is controlled and communicated with using a joystick and RF technology. A gas sensor detects gas leaks inside the building, while an ultrasonic sensor detects obstacles in the robot's path of travel. Additionally, internet protocol (IP) cameras are used to monitor and analyze conditions that will assist human detection in the most flexible and successful manner possible.

In Muralidhara et al. [35], the authors proposed a system for the detection of living humans that captures images and detects human bodies in disasters using an ARM7 controller and a low-cost camera with high efficiency. Here, the Viola–Jones algorithm is implemented using MATLAB software. Information about an active human presence is updated using a set of sensors, including a pulse sensor and a temperature sensor, while an IR sensor is used to avoid obstacles. As soon as the system detects a living human, it checks the affected victim's pulse rate and body temperature and sends this information, together with the GPS location, to the base station through a ZigBee module. This method requires far fewer images and less data collection for processing during transmission; it also reduces power consumption and keeps the cost of image processing minimal. Finally, a ZigBee transceiver receives the affected human's longitudinal and latitudinal location and displays it on a monitor to expedite the rescue operation.

In the research of Will et al. [36], a millimeter wave radar system operating at 24 GHz is proposed for detecting, tracking, and classifying human targets. It utilizes linear frequency modulation, multiple chirps, and advanced signal processing algorithms to accurately measure distance, angle, and velocity. The system successfully detects and tracks multiple humans while distinguishing them from other targets. Izzeldin et al. [37] describe a mobile robot system with camera-based target detection, as well as thermal and ultrasonic sensors, developed for autonomous rescue missions in disaster areas. It effectively detects trapped human victims, enables obstacle avoidance, and provides real-time monitoring through a wireless sensor network, ultimately reducing human losses and mitigating the impact of disasters. A device is developed to locate trapped individuals in collapsed buildings during natural disasters. By detecting volatile organic compounds (VOCs) and carbon dioxide levels, the device can determine the presence and approximate location of trapped individuals, improving rescue missions and potentially saving more lives. In the research of Sikhin and Harikrishnan [38], it was ascertained that this system also has applications in security and surveillance. According to Lu et al. [39], a high-sensitivity humidity sensor based on graphene-anchored polyamide 66 enables noncontact human–machine interaction (HMI). This technology allows for noncontact asthma detection, remote alarm systems, and touchless interfaces in medicine delivery. It ushers in a new era of HMI, reducing mechanical wear and cross-infection risks, and offers a strategy for developing smart electronics with noncontact interaction.

Forest fires pose a serious threat to the environment and biodiversity. Early detection is crucial, and an automated fire alert system is proposed in the study of Dasari et al. [40]. This system utilizes smoke and fire sensors to detect changes in physical quantities and remotely alerts users via a GSM module. As Dasari et al. [40] point out, this approach improves forest fire surveillance and enables timely intervention to mitigate damage. Seth et al. [41] realize an autonomous vehicle employing a convolutional neural network (CNN), a Raspberry Pi 4, and a 1/10th scale radio-controlled (RC) car. The system incorporates a camera and an ultrasonic sensor for enhanced functionality. Noteworthy aspects include system design, computer-aided design (CAD) modeling, and track construction, enabling the training and testing of the vehicle's self-driving capability. This implementation holds potential for applications in education and robotics, as well as among autonomous vehicle enthusiasts.

Dong et al. [42] assert that unmanned aerial vehicles (UAVs) with thermal imaging are essential for rapid SAR during natural disasters. A new thermal image dataset captured by drones is utilized, training survivor detection models such as YOLOv3 and YOLOv3-MobileNetV1. The network is optimized for real-time performance on NVIDIA's Jetson TX2, enabling a survivor detection system on the DJI Matrice 210 and Manifold 2-G for post-disaster SAR operations. As emphasized by Gunawan et al. [43], methods for measuring high dynamic range (HDR) image quality are crucial for enhancing user satisfaction in HDR-based visual services. The high sensitivity of the human visual system to distortions in HDR images poses a challenge. Two common methods of HDR image generation are discussed, and the predominance of full-reference and no-reference quality models is highlighted. There is potential for the development of reduced-reference HDR image quality assessment approaches. A low-cost autonomous underwater vehicle (AUV) is developed with 6-DOF capabilities, employing sensors such as a low-cost inertial measurement unit (IMU), a magnetometer, and a water pressure sensor. Onboard instruments and a PC104-based data logger collect and process data, while real-time validations through hardware-in-the-loop (HIL) simulations confirm the feasibility of the identification methods. Hassanein et al. [44] report that experimental results demonstrate the effectiveness of the adaptive controller in controlling the AUV's dynamics in various conditions. Abdul Rauf et al. [45] note that position estimation in UAVs without GPS using simultaneous localization and mapping (SLAM) techniques has been a prominent research area in aerial robotics.
Existing methods employing sensors like red green blue-depth (RGB-D), light detection and ranging (LiDAR), and ultra-wideband (UWB) are reviewed, with probability-based SLAM (linear Kalman filter [LKF] and extended Kalman filter [EKF]) being commonly utilized. Aerial robots have various applications such as rescue, transportation, and military operations.

In the study of Chowdhury et al. [46], a solution has been implemented for the creation of a rescue robot for detecting humans alive in the rubble. The authors propose that each country build a rescue machine that can crawl through earthquake and landslide debris while maintaining signal strength with low-cost boosters. The robot can monitor audio/video using a digital camera with a light-emitting diode (LED) torch. The vehicle's movement is controlled wirelessly, and it exchanges data with Wi-Fi beacons. It can also move conveniently across all types of terrain and locate victims using a carbon dioxide sensor.

In the study of Denker and İşeri [47], the researchers developed SALVOR, a mobile SAR robot. SALVOR's operation is based partly on data from its environmental sensors and partly on instructions from human operators. The robot's movements are controlled by a remote controller, with control actions implemented using an Arduino. The robot communicates with the operator to receive control commands, and the operator decides what to do with the environmental data the robot gathers from its sensory components. Once a decision to act is made, this cycle repeats to produce continuous movement, with the robot continually reporting information about its surroundings so the operator can assess the situation.

Hatano [48] developed a dual-arm robot mounted on a mobile base to perform rescue tasks under harsh conditions. However, blemishes and defects on the road surface cause undesirable vibrations and jerking motions when the robot moves over rough terrain; for its intended use of rescuing humans trapped under debris and transporting them to where medical facilities are available, such vibrations could negatively affect the victim being rescued, the rescuer, and even the robot itself. To address this limitation, the author proposed a joint module that uses magnetorheological (MR) fluid to reduce impact vibration from the environment.

Yan et al. [49] proposed a robot that can detect living humans, providing invaluable assistance and protection to rescue workers during SAR operations. This is accomplished by sensing the body temperature of living humans without actually touching them. In this robot, a thermal sensor, a microcontroller, a gas sensor, a high-definition (HD) wireless camera, and a variety of other modules collaborate to find living victims. An Android application was developed to operate the robot's movements and display sensor output values.

Methodology

The developed robot is controlled by a joystick, which allows the user to maneuver it easily. An RF module ensures that the system's data are transferred reliably within the disaster zone. Although a variety of sensors and features for urban search and rescue (USAR) robots are available on the market, they are prohibitively expensive; the sensors used in this project, by contrast, are low-cost and widely available. To reduce power consumption and improve rescue speed, the authors developed a system with two levels of human sensing. A PIR sensor forms the first level, detecting humans through their radiated IR waves, and an IP camera at the second level confirms the presence of humans in disaster-affected areas. This two-level human detection system makes the system reliable for rescue missions (Figure 1).

Figure 1:

Identification of victims in two levels.
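The two-level sensing strategy can be sketched as a simple gating function: the power-hungry camera check runs only after a cheap PIR hit. This is an illustrative outline under assumed names (detect_human, pir_triggered, camera_confirms), not the authors' implementation:

```python
def detect_human(pir_triggered, camera_confirms):
    """Two-level human detection.

    Level 1: the PIR sensor senses radiated IR waves (a cheap boolean gate).
    Level 2: the IP camera confirms the presence of a human, invoked only
    when level 1 fires, which saves power during long sweeps of the site.
    `camera_confirms` is a callable so the expensive check stays lazy.
    """
    if not pir_triggered:
        return False          # no IR motion: skip the camera entirely
    return camera_confirms()  # PIR hit: let the camera confirm or reject
```

Passing the camera check as a callable keeps the costly second stage from running unless the first stage triggers, mirroring the power-saving rationale in the text.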

Viola–Jones Algorithm

MATLAB software is used for the image processing algorithms. The Viola–Jones algorithm performs object detection in the vision cascade object detector and can detect a range of targets, including human faces and upper bodies. Accordingly, the algorithm's upper-body model enables the robot's visual system (the camera and the logic that triggers an alert on a potential match) to recognize humans buried or trapped in the debris.

This algorithm sends a signal to the ARM7 micro-controller via the universal serial bus – universal asynchronous receiver and transmitter (USB-UART) cable when it detects a human body. When the system detects a human body, this information will be displayed on the liquid crystal display (LCD), as a means of sharing the output with the controller. The vehicle then moves a short distance toward the injured person. Then, using a temperature sensor, pulse sensors are used to check the affected human's pulse rate in order to determine whether or not the human is alive. An unmanned vehicle [50] with a GPS module receives latitude and longitude coordinates from a satellite. When the face is detected, the ARM7 micro-controller receives a control signal. Once information is received that a human has been discovered in a disaster area, the vehicle's trip to the precise spot is then initiated, so that the web camera attached to the vehicle can capture a more accurate image of the victim. The Viola–Jones algorithm would then be used to identify the living human displayed in the image.
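At the core of the Viola–Jones detector is the integral image, which lets any rectangular (Haar-like) feature be evaluated in constant time from four array lookups. The following minimal sketch illustrates that idea in pure Python; it is not the MATLAB cascade detector the system uses, and all function names are illustrative:

```python
def integral_image(img):
    """Build a (h+1) x (w+1) summed-area table for a 2D grayscale image."""
    h, w = len(img), len(img[0])
    ii = [[0] * (w + 1) for _ in range(h + 1)]
    for y in range(h):
        for x in range(w):
            # Each cell is the sum of all pixels above and to the left.
            ii[y + 1][x + 1] = img[y][x] + ii[y][x + 1] + ii[y + 1][x] - ii[y][x]
    return ii

def box_sum(ii, x, y, w, h):
    """Sum of pixels in the w x h box at (x, y): four lookups, O(1)."""
    return ii[y + h][x + w] - ii[y][x + w] - ii[y + h][x] + ii[y][x]

def two_rect_feature(ii, x, y, w, h):
    """A two-rectangle Haar-like feature: left half minus right half (w even).
    Cascades of thousands of such features, each O(1), make detection fast."""
    half = w // 2
    return box_sum(ii, x, y, half, h) - box_sum(ii, x + half, y, half, h)
```

The constant-time box sums are why the cascade can scan every window position and scale of a frame at a practical speed.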

Signal Strengthening with Dijkstra's Algorithm

The system calculates and stores the shortest distance between the beacons and the main routers using Dijkstra's algorithm. The shortest path is used to relay the signal from the robot to the base. The authors’ main goal was to build an all-terrain robot that could traverse a variety of surfaces, as well as climb to new heights and inclined planes.
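A minimal sketch of how Dijkstra's algorithm can compute shortest relay distances from the robot to the base over a beacon graph is shown below; the adjacency-list representation and node names are assumptions for illustration, not taken from the cited system:

```python
import heapq

def dijkstra(graph, source):
    """Shortest distances from `source` to every reachable beacon/router.

    `graph` maps each node to a list of (neighbor, link_cost) pairs.
    Returns a dict of node -> shortest total cost; the signal would be
    relayed along the corresponding minimum-cost path to the base.
    """
    dist = {source: 0}
    pq = [(0, source)]                      # min-heap of (distance, node)
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue                        # stale entry, already improved
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist

# Hypothetical beacon layout: robot -> two beacons -> base station.
beacons = {
    "robot": [("b1", 2), ("b2", 5)],
    "b1": [("base", 4)],
    "b2": [("base", 1)],
}
```

Storing the computed distances, as the text describes, means the relay path need not be recomputed for every transmitted packet.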

Detection of Human Presence and Indoor Localization

CO2 sensors and a thermal camera are used to accomplish this task. When the CO2 level rises above a predetermined threshold, the sensor activates the buzzer, signaling the presence of humans or animals exhaling CO2. Furthermore, the thermal camera detects heat emitted from the human body, allowing the system to distinguish between a dead and living person.
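The threshold-and-confirm logic described above can be sketched as follows; the 1000 ppm threshold and all names are illustrative assumptions, since the text does not give numeric values:

```python
CO2_THRESHOLD_PPM = 1000  # assumed alert threshold; the system's value is unspecified

def presence_alert(co2_ppm, body_heat_detected):
    """Combine the CO2 gate with thermal confirmation.

    A CO2 level above the threshold triggers the buzzer (someone is
    exhaling); the thermal camera then distinguishes a living person,
    who emits body heat, from one who does not.
    """
    if co2_ppm <= CO2_THRESHOLD_PPM:
        return "no presence"
    return "living person" if body_heat_detected else "check further"
```

As in the two-level PIR/camera scheme, the cheap chemical gate runs first and the thermal confirmation refines the result.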

SAR Using RF Technology

Efforts were made by Sharmin et al. [51] to develop a low-cost SAR robot. Since low cost was a principal objective of the project, the researchers ensured that the robot was simple to build using readily available components, making it suitable for developing countries like Bangladesh. Because it can be controlled wirelessly [52] from a station without requiring specialized knowledge of robot operation, this robot provides valuable information to human rescuers. It uses the control station to send continuous thermal and video feeds, as well as sensory data from the environment, from the disaster area to the linked web applications. Using a powerful remote-controlled flash light, it can run through rough terrain in complete darkness [53]. It can detect and communicate with both living humans and rescuers within the rubble.

A robot design comprises three types of parts: mechanical, electrical, and transmission. As for the mechanical parts, the robot discussed here has four wheels, on which the electrical components were mounted using screws and a polyvinyl chloride (PVC) board on the body of the vehicle. It also has four independent suspension springs controlled by radio link, so it can climb slopes of up to 60° and run easily in a variety of terrain conditions.

The electrical components include microprocessors and an Arduino UNO. Gas sensors detect the air quality and CO2 level of the atmosphere within the debris in order to ascertain whether the humans inside are still alive; an elevated CO2 concentration tentatively indicates ongoing respiration, and hence that trapped humans may still be alive. Various mechanisms can translate the CO2 level in the air into digital information suitable for transmission; the MQ-7 gas sensor, for example, outputs an analog voltage proportional to the gas concentration it senses. Barometric sensors, such as the BMP-280, were used to determine the bot's altitude, indicating which floor it was on. A thermal camera sensor was also used, arranged as an 8 × 8 grid-eye array capable of detecting humans within a 2-m range, along with a 4-W LED flash light that can be switched on and off from a distance. Sharmin et al. [51] further decided to include a small onboard audio transceiver (walkie-talkie) that can communicate with another one at the station within 1 km. As a result, rescuers are able to communicate with the people at the station.
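Translating an MQ-series sensor's analog output into digital information typically means reading it through an ADC and comparing the resulting voltage against a calibrated threshold. The sketch below assumes a 5 V reference and a 10-bit ADC (as on an Arduino UNO); the 2.0 V alert threshold is illustrative, since real deployments calibrate each sensor individually:

```python
VREF = 5.0      # ADC reference voltage (assumed 5 V supply)
ADC_MAX = 1023  # full-scale count of a 10-bit ADC

def adc_to_volts(raw):
    """Convert a raw 10-bit ADC count to the sensor's output voltage."""
    return raw * VREF / ADC_MAX

def gas_alert(raw, threshold_v=2.0):
    """True when the gas sensor's output voltage exceeds the threshold.

    MQ-series outputs rise with gas concentration, so a voltage threshold
    stands in for a concentration threshold; 2.0 V is an assumed value
    that would be set per sensor during calibration.
    """
    return adc_to_volts(raw) > threshold_v
```

Mapping the voltage all the way to a ppm figure requires the sensor's nonlinear calibration curve, which is why this sketch stops at a threshold decision.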

This robot uses four transmission channels, each operating at a different frequency range. Channel 1's transmitter and receiver use a 2.4 GHz frequency to carry the user's movement commands to the robot. Channel 2, the onboard HC-12 transceiver module with a frequency range of 433.4–473.0 MHz, transmits all sensor data to the station. Channel 3, operating at 5.8 GHz, carries the audio/video signals generated by a first-person view (FPV) wide-angle camera with a microphone to the receiver station. Channel 4, operating at 462.5–462.6 MHz, is dedicated to the human audio signals of the walkie-talkie system mentioned above.

Flow Diagram

In this flow diagram, the robot can be deployed at any point on the site while the operator remains safe on the outside. It can also transmit audio signals from sites whose location or condition would otherwise prevent communication with the outside world, that is, with the robot's operator or the team directing the remote phase of the relief operations. The FPV and thermal cameras are mounted at different locations on the robot, which can report the altitude (floor) of any human it finds to the rescue team. A website displays two video feeds, one from the thermal camera and the other from the FPV camera. Finally, the battery has a minimum run time of 30 min (Figure 2).

Figure 2:

Flow diagram for rescue mission.

Sensor/sensing technology | Pros | Cons | Ref.
Radar | Long range; no need for line of sight; target velocity can be computed; target tracking is possible | Expensive; interference | [4], [54]
LiDAR | Familiar in robotics; provides detailed information about the environment | High cost; cannot measure distance in heavy rain | [2], [55]
Magnetic | Able to detect metal objects | Small range | [56]
ToF camera | Can provide 3D measurements | Less accurate; low resolution; cannot be deployed in outdoor operation | [57]
Acoustic | Wide range; cost effective | Acoustic characteristics differ across environments | [58]
Ultrasonic | Wide range; cost effective | Sound is absorbed by clothing and foliage | [34], [59], [64]
Optical | Wide range; easy target identification | Costly; needs line of sight | [60]
IR and thermal | Can detect a target in the dark | Detecting a target is difficult in hot environments | [34], [47], [51]
RF | Easy to install; affordable; long distance | Cables are required along the perimeter; volumetric range is limited | [34], [47], [51], [61]
Motion | Can classify the type of intrusion based on structures; cost effective | Limited range | [34], [35], [64]
Seismic | Exceptional stealth | Differs in each environment | [62]

IR, infrared; LiDAR, light detection and ranging; RF, radio frequency; ToF, Time of Flight.

Sensors Used in Robots

Many sensors are employed in robots to execute specific tasks, and we will discuss the sensors and their functions here.

Analysis of Human Detection Sensor with Graph

Various kinds of sensors, presented in the tables in the following sub-sections, can be used to detect living victims under the rubble. If a robot is required to work in a desert and transmit temperature data, a temperature sensor is a simple solution. Temperature-sensor integrated circuits (ICs) produce a voltage that varies with temperature; the LM34, LM35, TMP35, TMP36, and TMP37 are among the most commonly used temperature-sensor ICs (Figures 3–5).

Figure 3:

Temperature sensor.

Figure 4:

Gas sensor.

Figure 5:

PIR sensor. PIR, passive infrared.

The diagram below (Figure 6) shows how the sensors perform detection under normal conditions, as well as after the robot has traveled 1 km, indicating the level of sensing in both scenarios. The graph also shows how the measured values differ when the human body is still, moving slowly, or burned.

Figure 6:

Percentage analysis graph for human detection sensors.

Table for Analysis of Comparison Factor
Comparison factors considered: obstacle detection; microphone; location tracking; environmental condition monitoring; live streaming; gas detection; pulse sensing; temperature sensing.

Papers compared:
1. Living human detection robot in earthquake conditions [33]
2. SAR system for detection of living humans by semi-autonomous mobile rescue robot [34]
3. Unmanned vehicle for detection of living humans during calamity [35]
4. Terminal analysis of the operation of a rescue robot constructed for assisting secondary disaster situations [46]
5. A low cost USAR robot for developing countries [61]
6. Design and implementation of a semi-autonomous mobile SAR robot [47]
7. Disaster response and surveillance bot [61]
8. Ground robot for detection of living humans in rescue operations [63]

SAR, search and rescue; USAR, urban search and rescue.

Table for Analysis of Technology Comparison Factor
Technology comparison factors considered: RF module; Bluetooth; WiFi; IoT; Zigbee module; Android app.

Papers compared:
1. Living human detection robot in earthquake conditions [33]
2. SAR system for detection of living humans by semi-autonomous mobile rescue robot [34]
3. Unmanned vehicle for detection of living humans during calamity [35]
4. Terminal analysis of the operation of a rescue robot constructed for assisting secondary disaster situations [46]
5. A low cost USAR robot for developing countries [61]
6. Design and implementation of a semi-autonomous mobile SAR robot [47]
7. Disaster response and surveillance bot [61]
8. Ground robot for detection of living humans in rescue operations [63]

IoT, Internet of Things; RF, radio frequency; SAR, search and rescue; USAR, urban search and rescue.

Comparative Study

This comparative study will provide information on the difficulties associated with human detection in a variety of areas and will assist in future measures aimed at overcoming the drawbacks.

1. Living human detection robot in earthquake conditions [33]. Pros: low cost; more efficient; live streaming; more suitable for landslides/avalanches. Cons: owing to the lack of internet connectivity during landslides and avalanches, radar communication is preferred over IoT.
2. SAR system for detection of living humans by semi-autonomous mobile rescue robot [34]. Pros: low cost; more reliable; the sensors used are cheap and easily available; low power consumption; high efficiency. Cons: no location tracking or environment monitoring.
3. Unmanned vehicle for detection of living humans during calamity [35]. Pros: low cost; more reliable; long-distance communication; low power consumption. Cons: no live streaming or environmental monitoring.
4. Terminal analysis of the operation of a rescue robot constructed for assisting secondary disaster situations [46]. Pros: optimum size and strength; simple and more reliable; easy to navigate in all types of terrain. Cons: no location tracking.
5. A low-cost USAR robot for developing countries [61]. Pros: with a walkie-talkie on board, rescuers can communicate up to 1 km away. Cons: no location tracking.
6. Design and implementation of a semi-autonomous mobile SAR robot [47]. Pros: built with a CMOS camera for digital image production. Cons: the camera connection drops, and battery power is a major issue.
7. Disaster response and surveillance bot [61]. Pros: low power consumption; easily controllable; a flame sensor additionally detects the presence of fire; low cost and affordable. Cons: no location tracking.
8. Ground robot for detection of living humans in rescue operations [63]. Pros: more accurate and efficient; improved resuscitation services for catastrophe victims. Cons: no environmental monitoring or obstacle detection.

CMOS, complementary metal oxide semiconductor; IoT, Internet of Things; SAR, search and rescue; USAR, urban search and rescue.

Future Technologies

AI, the IoT, Big Data, and blockchain are all becoming increasingly sophisticated; as a result, these technologies have the potential to dramatically improve disaster response and relief capabilities. When innovative concepts and highly developed technologies are brought together, new developments in crisis management emerge.

Applications

In military contexts especially, identifying the presence of living humans in a given area is vitally important. Bots can also be used for cartography and navigation in disaster zones in the aftermath of catastrophic events such as explosions, accidents, terrorist attacks, and natural catastrophes. Robots are assisting firefighters in countries throughout the world. AI is being used by bomb squads to help defuse explosives and dispose of hazardous materials. Work is also underway on applying AI in the aftermath of natural disasters to search for people by listening for their breathing and monitoring their heartbeats for signs of life. Drones are used to deliver supplies such as food, water, medication, and other materials remotely.

Conclusions and Future Work

The focus of this article is on sensor-based SAR tasks, specifically the mechanical structure, human detection abilities, communications, operator interface, and programming design. Human detection is a difficult task that takes considerable time. To expedite the process and improve the likelihood of reaching a greater number of injured persons alive, robotics technology can be deployed through remotely controlled operations to survey the disaster area for signs of life, so that human intervention can be directed as early as possible toward the regions in the most urgent need. Specifically, living humans buried in rubble are detected using sensors built into a robot, which traverses the disaster-affected area and scans for the presence of life based on revelatory parameters such as the CO2 content of the air and heat signatures, thereby identifying humans in need of urgent aid and saving lives. Furthermore, technological superiority ensures that the sensory data sent to the base station are accurate. This analysis helps identify the challenges faced when searching disaster-afflicted zones for living humans who have been rendered immobile or unable to communicate. In the future, we will use the MATLAB simulation software tool to process images captured by robots and develop an algorithm for evaluating image parameters. We shall also pay attention to power consumption and formulate an effective means of reducing it.
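The life-scanning step described above, combining cues such as CO2 content and heat signatures, can be sketched as a simple decision rule. This is an illustrative fusion scheme, not one taken from the reviewed systems; the thresholds, baseline values, and two-cue rule are all hypothetical.

```python
# Illustrative sketch: fusing a few sensor cues discussed in this survey
# (CO2 level, surface temperature, PIR motion) into a "possible survivor"
# flag. All thresholds below are hypothetical assumptions.

CO2_AMBIENT_PPM = 450        # assumed ambient baseline
CO2_BREATH_DELTA_PPM = 150   # rise suggesting exhaled air in a void
BODY_TEMP_RANGE_C = (30.0, 42.0)  # plausible human surface temperatures

def possible_survivor(co2_ppm: float, surface_temp_c: float,
                      pir_motion: bool) -> bool:
    """Flag a location for rescuers if at least two cues indicate life."""
    cues = 0
    if co2_ppm - CO2_AMBIENT_PPM >= CO2_BREATH_DELTA_PPM:
        cues += 1  # elevated CO2 consistent with breathing
    if BODY_TEMP_RANGE_C[0] <= surface_temp_c <= BODY_TEMP_RANGE_C[1]:
        cues += 1  # heat signature in the human range
    if pir_motion:
        cues += 1  # movement detected by the PIR sensor
    return cues >= 2

# Elevated CO2 plus a body-range heat signature is enough to flag:
print(possible_survivor(650, 33.5, False))
```

Requiring two independent cues is one way to reduce false alarms from, for example, warm debris or decaying vegetation; a deployed system would tune such thresholds per environment, as the sensor comparison tables suggest.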

eISSN: 1178-5608