
Introduction

Technology is advancing rapidly, and its application is streamlining human tasks, prompting researchers to explore technologies that enhance human life. One notable development is the autonomous vehicle: in simple terms, a vehicle capable of operating itself without human intervention.

Numerous literature sources explore the use of autonomous vehicles in various applications. For instance, the study [1] employs an Unmanned Ground Vehicle (UGV) to dispense therapeutic vaporized essential oils or medications to combat and prevent COVID-19 infections. Another intriguing aspect worth noting is the domain of formation control, or consensus control, for mobile robots, as studied by Andaluz et al. [2]. Given the numerous advantages and possible uses of autonomous vehicles, it is understandable that investment in autonomous vehicle (AV) technology has risen over the years. Researchers and developers collaborate to advance AVs, gaining insights into their functioning and implications [3].

One crucial component of the autonomous vehicle is its perception and localization system [4]. This system integrates sensors installed in the vehicle that assess its surroundings. In autonomous vehicles, perception and localization systems usually adopt sensor fusion methods [5] that merge data from multiple sensors to achieve more precise outcomes.

For over a quarter-century, simultaneous localization and mapping (SLAM) has received substantial attention within the mobile robotics community [6]. The ongoing success of this field is closely linked to the fact that solving the SLAM problem (locating a robot while incrementally constructing a map of its surroundings) offers a wide array of applications, ranging from exploring new spaces to enabling autonomous driving. The recent focus on autonomous vehicles has catalyzed more research efforts with the contributions of car manufacturers [7].

Though the global navigation satellite system (GNSS) may seem like a solution to the localization challenge, it has proven inadequate. Even with enhanced precision using well-positioned base stations (real-time kinematic GNSS), signal availability remains problematic due to unpredictable atmospheric conditions and signal blockages caused by infrastructure [8]. These problems lead to localization errors, especially in urban environments where tall buildings obstruct the satellites, although GNSS tends to perform better on open roads. The SLAM framework addresses this issue while remaining versatile enough to accommodate any sensor or estimation method that fulfills the requirements of SLAM. In autonomous driving, the map is crucial because it provides the initial perception on which decision-making is built.

SLAM is essential in robotics, particularly for mobile robot systems like autonomous vehicles. Its primary goal is to simultaneously determine the robot's position and construct a representation of the surrounding map [9, 10]. The environment map is needed in SLAM for two reasons [11, 12]. First, a map often supports other tasks, such as route planning or providing initial guidance. Second, the map helps correct errors in robot state estimation, preventing drift: without a map, dead reckoning becomes unreliable, whereas a map (e.g., identified landmarks) enables the robot to recalibrate its location by revisiting known areas. Hence, SLAM is particularly valuable when no existing map is available and one needs to be created [13].

SLAM employs diverse techniques to extract feature information from the environment for mapping [14]. These techniques encompass ultrasonic sensors, infrared sensors, cameras, light detection and ranging (LiDAR), and encoders to obtain initial robot location data through odometry. However, odometry-based approaches tend to accumulate errors over time. Thus, a filtering method for solving the sensors' data errors is required.

The Kalman filter (KF) represents the first category of methods that adopt a filtering approach [7, 15]. The KF operates under the assumption of Gaussian noise affecting the data, which does not always hold in this context. Despite their impressive convergence attributes, KFs were originally designed for linear systems and are seldom applied directly to SLAM [16, 17]. In practice, autonomous vehicles in real environments tend to form nonlinear systems due to environmental complexity and uncertainties [18]. The extended Kalman filter (EKF), on the other hand, is commonly used for nonlinear filtering [19]. The EKF employs linearization, approximating the nonlinear system with a first-order Taylor expansion around the current estimate [20]. The EKF is optimal when the linearization occurs around the precise state vector value.
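To make the linearization explicit, the standard EKF formulation (the textbook form this paragraph refers to, not a result specific to the cited studies) approximates the nonlinear motion and measurement models by first-order Taylor expansions, with Jacobians evaluated at the latest estimates:

$f(\beta) \approx f(\hat{\beta}_{k-1}^{+}) + F_k\left(\beta - \hat{\beta}_{k-1}^{+}\right), \qquad F_k = \left.\dfrac{\partial f}{\partial \beta}\right|_{\beta = \hat{\beta}_{k-1}^{+}}, \qquad H_k = \left.\dfrac{\partial h}{\partial \beta}\right|_{\beta = \hat{\beta}_{k}^{-}}$

These Jacobians $F_k$ and $H_k$ are the matrices used in the covariance propagation and gain computation of the EKF equations presented in the next section.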

Furthermore, this study combines the EKF with SLAM to enhance SLAM performance for improved autonomous vehicle mapping, validated experimentally in an artificial laboratory-scale environment. In summary, as technology advances, autonomous vehicles increasingly rely on perception systems. SLAM, prominent in robotics for over 25 years, supports self-driving and exploration, and the limitations of GNSS drive further interest in it. Diverse techniques extract features for mapping, but errors can accumulate over time, which has brought KFs and EKFs to attention. KFs suit linear systems, while EKFs are more promising for the nonlinear problems of autonomous vehicles. Finally, integrating the EKF into SLAM promises better autonomous mapping in complex real-world environments.

EKF-SLAM Systems Modeling

Building on the foundations laid by the linear KF [21], the EKF follows a similar structure, comprising two essential phases: prediction and correction. The EKF technique fuses information from two categories of position-determining sensors. This filtering technique can achieve optimal estimates of the state parameters, even in the presence of measurement disturbances. The EKF is required because the kinematic model of a mobile robot is nonlinear and therefore demands a distinctive strategy. For effective operation, the EKF requires both a system model and a measurement model. Based on the study [22], the predicted state is given in Eq. (1), and the predicted error covariance is shown in Eq. (2).

$\hat{\beta}_k^{-} = f\left(\hat{\beta}_{k-1}^{+},\, V_k\right)$ (1)

$P_k^{-} = F_k P_{k-1} F_k^{T} + Q_k$ (2)

The correction or updating phase is split into three stages: (1) calculating the Kalman gain, as shown in Eq. (3); (2) updating the state estimate using measurements, as given in Eq. (4); and (3) correcting the error covariance based on the measurements, using Eq. (5). The detailed parameters of Eqs. (1)–(5) are presented in Table 1, and the steps of the EKF-SLAM algorithm are shown in Table 2.

$K_k = P_k^{-} H_k^{T}\left(H_k P_k^{-} H_k^{T} + R_k\right)^{-1}$ (3)

$\hat{\beta}_k^{+} = \hat{\beta}_k^{-} + K_k\left(z_k - h(\hat{\beta}_k^{-})\right)$ (4)

$P_k^{+} = P_k^{-} - K_k H_k P_k^{-}$ (5)

The EKF parameter details.

Parameter Information
$\hat{\beta}_k^{-}$ The estimated (predicted) state
$f$ The nonlinear system model
$h$ The measurement model
$v_k$ Measurement noise
$z_k$ IMU measurement
$W_{k-1}$ Dynamic system noise
$\hat{\beta}_{k-1}^{+}$ The updated state estimate from the previous step
$V$ The linear velocity of the robot
$\Delta t$ The sampling time step
$\theta$ The steering angle of the robot
$Q_k$ Dynamic system noise matrix, $Q = 10^{-4} I_{5}$ (a $5 \times 5$ diagonal matrix with $10^{-4}$ on the diagonal)
$H_k$ IMU measurement matrix, $H_{\mathrm{odometry+IMU}} = I_{5}$ (the $5 \times 5$ identity matrix)
$R_k$ Measurement noise matrix, $R = 10^{-4} I_{5}$ (a $5 \times 5$ diagonal matrix with $10^{-4}$ on the diagonal)
$F_k$ Jacobian matrix, $F_k = \begin{bmatrix} 1 & 0 & -V_{k-1}\Delta t\sin\theta_k & \Delta t\cos\theta_k & 0 \\ 0 & 1 & V_{k-1}\Delta t\cos\theta_k & \Delta t\sin\theta_k & 0 \\ 0 & 0 & 1 & 0 & \Delta t \\ 0 & 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 0 & 1 \end{bmatrix}$

EKF, extended Kalman Filter; IMU, inertial measurement unit.

The steps of the EKF-SLAM Algorithm.

EKF-SLAM Algorithm

1. Initialize the previous state estimate (b_prev) and the previous error covariance (p_prev) to 0.
2. Compute the predicted state (b_new) based on Eq. (1).
3. Compute the predicted error covariance (p_new) based on Eq. (2).
4. Obtain the optimal gain (K) based on Eq. (3).
5. Obtain the updated state estimate (b) based on Eq. (4).
6. The updated state estimate is the filter output that is displayed.
7. Obtain the updated error covariance (p) based on Eq. (5).
8. Store the updated state estimate as the previous estimate, then return the state estimation to step 2.
9. Store the updated error covariance as the previous covariance, then return the covariance computation to step 3.

EKF, extended Kalman filter; KF, Kalman filter; SLAM, simultaneous localization and mapping.
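To make the loop in Table 2 concrete, below is a minimal Python/NumPy sketch of Eqs. (1)–(5) under stated assumptions: a five-dimensional state ordered as [x, y, θ, v, ω], the constant Q, R, and H matrices of Table 1, and hypothetical f() and jacobian_f() helpers standing in for the vehicle's motion model and its Jacobian. It is an illustrative sketch, not the exact implementation used in this study.

```python
import numpy as np

# Constant filter matrices from Table 1 (5x5): Q = R = 1e-4 * I, H = I.
N = 5
Q = 1e-4 * np.eye(N)
R = 1e-4 * np.eye(N)
H = np.eye(N)

def f(beta, v, dt):
    """Hypothetical nonlinear motion model; state assumed as [x, y, theta, v, omega]."""
    x, y, theta, _, omega = beta
    return np.array([x + v * dt * np.cos(theta),
                     y + v * dt * np.sin(theta),
                     theta + omega * dt,
                     v,
                     omega])

def jacobian_f(beta, v, dt):
    """Jacobian F_k of the motion model, matching the F_k given in Table 1."""
    theta = beta[2]
    F = np.eye(N)
    F[0, 2] = -v * dt * np.sin(theta)
    F[0, 3] = dt * np.cos(theta)
    F[1, 2] = v * dt * np.cos(theta)
    F[1, 3] = dt * np.sin(theta)
    F[2, 4] = dt
    return F

def ekf_step(beta_prev, P_prev, z, v, dt):
    """One EKF iteration following Eqs. (1)-(5)."""
    # Prediction (Eqs. 1-2)
    beta_pred = f(beta_prev, v, dt)
    F = jacobian_f(beta_prev, v, dt)
    P_pred = F @ P_prev @ F.T + Q
    # Correction (Eqs. 3-5); the measurement model is taken as identity here (H = I)
    K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)
    beta_upd = beta_pred + K @ (z - H @ beta_pred)
    P_upd = P_pred - K @ H @ P_pred
    return beta_upd, P_upd

# Step 1 of Table 2: initialize the previous estimate and covariance to zero.
beta, P = np.zeros(N), np.zeros((N, N))
# Steps 2-9 then repeat for every new measurement z_k (e.g., fused odometry/IMU data).
```

In this sketch, beta and P play the roles of b_prev and p_prev in Table 2, and each new measurement vector z drives one call to ekf_step().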

Methodology

The research was conducted in two phases. The initial stage involved the validation of various algorithms within a three-dimensional virtual environment, specifically employing Gazebo on a Robot Operating System (ROS). The primary objective of this stage was to determine whether the algorithms were prepared for implementation on real hardware. During this simulation phase, a simplified model representing the actual specifications of the autonomous vehicle robot hardware was utilized. This approach ensured the algorithm design and programming framework aligned with real-world conditions. Once the simulation proved successful, the subsequent validation entailed conducting physical experiments with real autonomous vehicle robot hardware.

Subsequently, various data-gathering procedures were conducted via experiments. These experiments encompassed LiDAR mapping, testing the inertial measurement unit (IMU), including measurements from the gyroscope and accelerometer, conducting encoder tests for each wheel, and performing sensor fusion measurements using the EKF. Additionally, the findings from this data collection were juxtaposed with existing studies that had previously explored related subjects.

Prototype set-up

This study developed a model utilizing a Jetson Nano as a single-board computer (SBC) from NVIDIA [23], an Arduino UNO R3 [24], a Tamiya (USA) TBLE-02S brushless RC ESC [25], an encoder package from Waveshare [26], an RPLiDAR A1M8 [27], and an MPU6050 as the IMU. The car prototype is built around the Jetson Nano SBC, which integrates the Tamiya ESC and the Waveshare encoders through the Arduino UNO. The Tamiya TBLE-02S ESC drives the motor, the RPLiDAR A1M8 connects directly to the Jetson Nano through USB, and the MPU6050 is wired to specific pins of the Jetson Nano. The encoders and the steering servo are managed by the Arduino, and the battery-powered ESC also supplies the Jetson Nano, as shown in Figure 1. The components of the robotic-scale vehicle are thus organized as illustrated in Figure 2, and the hardware parameters are given in Table 3.

Figure 1:

The hardware wiring and schematics.

Figure 2:

The physical look of a robot.

The details of robot parameters.

Parameter Value
Battery 12.6 V (DC)
Dimensions 242.9 × 192.2 mm
Steering servo 9 kg·cm torque
Wheel motor 240 RPM
Wireless 2.4 GHz/5 GHz dual-band Wi-Fi, Bluetooth 4.2
Driving type Ackermann steering, dual-gearmotor rear-wheel drive

RPM, revolutions per minute.

World scenario

This study's physical-world and measurement scenario took place at the Embedded and Cyber-Physical System (ECS) laboratory within the Department of Engineering Physics, Sepuluh Nopember Institute of Technology (ITS). The vehicle was directed down a straight road to simplify data gathering. A variable-height barrier was placed along the road, aligned with the vehicle's LiDAR sensor, to guide its path. The road's width and length were 30 cm and 200 cm, respectively, to accommodate each lane. Additionally, a barrier was erected to encircle the road, facilitating the LiDAR sensor's environmental perception. The physical world and measurement scenario can be seen in Figure 3.

Figure 3:

The physical world and measurement scenario.

Encoder Combination

In this study, the prototype employs a pair of encoders, one on the left wheel and one on the right wheel (as shown in Figure 4). These encoders produce data as ticks in an odometry system, and the tick counts rise as the wheels rotate. The motor requires an encoder combined with the Tamiya TBLE-02S ESC as its motor driver so that wheel odometry can support the robot's localization.

Figure 4:

Encoders' schematic diagram.

In practical scenarios, odometry represents a vehicle's rotational speed and distance traveled more accurately than speed measurements. While both are susceptible to drift and slip, speed-based estimation introduces additional errors arising from disparities between the actual and the estimated motion and position. As a result, the two-wheel differential drive model is utilized for calculating vehicle odometry, as given in Eq. (6).

$u_t \approx \begin{bmatrix} v_t \Delta t \cos\mu_{t-1,\theta} \\ v_t \Delta t \sin\mu_{t-1,\theta} \\ \omega_t \Delta t \end{bmatrix}$ (6)

In Eq. (6), $\omega_t$ is the angular speed, which is estimated using Eq. (7). In Eq. (7), $B$ is the distance between the two wheels, $R_R$ and $R_L$ are the radii of the right and left wheels, and $\omega_t^R$ and $\omega_t^L$ are the angular speeds of the right and left wheels, respectively. $\omega_t^R$ and $\omega_t^L$ can be calculated using Eqs. (8) and (9), where $E_t^R$ and $E_t^L$ are the tick counts of the right and left wheels and $E_T$ is the number of ticks per wheel revolution.

$\omega_t = \dfrac{\omega_t^R R_R + \omega_t^L R_L}{B}$ (7)

$\omega_t^R = \dfrac{2\pi E_t^R}{E_T \Delta t}$ (8)

$\omega_t^L = \dfrac{2\pi E_t^L}{E_T \Delta t}$ (9)
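To illustrate Eqs. (6)–(9), the following is a minimal Python sketch of how the two encoder tick counts could be turned into an odometry increment. The wheel radii, wheel base, ticks-per-revolution value, and the mean-rim-speed estimate of the linear velocity are illustrative assumptions, not the actual parameters or code of the prototype.

```python
import math

# Placeholder geometry; the real prototype values are not listed in the text.
R_R = R_L = 0.03      # wheel radii in meters (assumed)
B = 0.16              # distance between the two wheels in meters (assumed)
E_T = 390             # encoder ticks per wheel revolution (assumed)

def odometry_step(x, y, theta, ticks_r, ticks_l, dt):
    """Advance the pose (x, y, theta) by one odometry increment, Eqs. (6)-(9)."""
    # Eqs. (8)-(9): wheel angular speeds from tick counts over dt
    omega_r = 2.0 * math.pi * ticks_r / (E_T * dt)
    omega_l = 2.0 * math.pi * ticks_l / (E_T * dt)
    # Eq. (7): body angular speed as given in the text
    omega = (omega_r * R_R + omega_l * R_L) / B
    # Linear speed taken as the mean of the wheel rim speeds (an assumption,
    # not specified in the text)
    v = (omega_r * R_R + omega_l * R_L) / 2.0
    # Eq. (6): pose increment using the previous heading
    x += v * dt * math.cos(theta)
    y += v * dt * math.sin(theta)
    theta += omega * dt
    return x, y, theta

# Example: 20 right ticks and 20 left ticks measured over a 0.1 s window
pose = odometry_step(0.0, 0.0, 0.0, 20, 20, 0.1)
```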
Localization and Perception

The EKF algorithm's configuration involves the merging of information from the odometry sensor and the MPU6050 sensor. By processing these two datasets through the ROS protocol, the algorithm deduces the estimated position of the vehicle, as illustrated in Figure 5.

Figure 5:

The Localization and perception block diagram. ROS, robot operating system.
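As a rough illustration of the data flow in Figure 5, the following Python (rospy) sketch shows how a node might subscribe to the wheel-odometry and IMU topics and publish a fused pose estimate. The topic names (/odom, /imu/data, /ekf_pose) and the deferral of the actual fusion to an EKF step such as the one sketched earlier are assumptions for illustration, not the exact node used in this study.

```python
import rospy
from nav_msgs.msg import Odometry
from sensor_msgs.msg import Imu
from geometry_msgs.msg import PoseWithCovarianceStamped

class FusionNode:
    """Hypothetical fusion node: buffers odometry and IMU data, publishes a fused pose."""

    def __init__(self):
        self.last_odom = None
        self.last_imu = None
        self.pub = rospy.Publisher("/ekf_pose", PoseWithCovarianceStamped, queue_size=10)
        rospy.Subscriber("/odom", Odometry, self.odom_cb)
        rospy.Subscriber("/imu/data", Imu, self.imu_cb)

    def odom_cb(self, msg):
        self.last_odom = msg

    def imu_cb(self, msg):
        self.last_imu = msg
        self.fuse()

    def fuse(self):
        if self.last_odom is None or self.last_imu is None:
            return
        # Here the buffered odometry and IMU readings would be passed to the
        # EKF prediction/correction step (see the earlier ekf_step() sketch).
        out = PoseWithCovarianceStamped()
        out.header.stamp = rospy.Time.now()
        out.header.frame_id = "odom"
        out.pose.pose = self.last_odom.pose.pose  # placeholder: fused pose goes here
        self.pub.publish(out)

if __name__ == "__main__":
    rospy.init_node("ekf_fusion_node")
    FusionNode()
    rospy.spin()
```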

Figure 6:

The map generated using the Hector-SLAM algorithm. SLAM, Simultaneous localization and mapping.

Results and Discussions
LiDAR Mapping

In this research, we acquired LiDAR data using the RPLiDAR A1M8, which later serves as the basis of the mapping output for the vehicle robot. The sensor provides 360° scanning and ranging of the surrounding environment, with a maximum distance of 12 m and a ranging frequency of 8,000 Hz. Real-time LiDAR data were collected through the ROS platform, and mapping was accomplished using the Hector-SLAM package. The Hector-SLAM algorithm relies on high-resolution, high-frequency laser data and does not require odometry information [28]; the resulting map is shown in Figure 6.
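As a small illustration of how the scan and map data can be accessed through the ROS platform, the sketch below subscribes to a laser-scan topic and to the occupancy grid published by a Hector-SLAM mapping node. The topic names /scan and /map are conventional defaults assumed here for illustration, not confirmed by the text.

```python
import rospy
from sensor_msgs.msg import LaserScan
from nav_msgs.msg import OccupancyGrid

def scan_cb(scan):
    # Report the closest and farthest valid returns of the 360-degree scan.
    valid = [r for r in scan.ranges if scan.range_min < r < scan.range_max]
    if valid:
        rospy.loginfo("LiDAR min %.2f m, max %.2f m", min(valid), max(valid))

def map_cb(grid):
    # The Hector-SLAM map arrives as an occupancy grid (cells in row-major order).
    rospy.loginfo("Map %d x %d cells, resolution %.3f m/cell",
                  grid.info.width, grid.info.height, grid.info.resolution)

rospy.init_node("lidar_map_listener")
rospy.Subscriber("/scan", LaserScan, scan_cb)
rospy.Subscriber("/map", OccupancyGrid, map_cb)
rospy.spin()
```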

Table 4 shows the distance measurements obtained by the vehicle robot; these data inform the vehicle's perception. LiDAR constructs maps of the raw environment, which are visualized through RViz in ROS. The maximum and minimum sensor values are reported in meters.

The maximum and minimum distance values of the LiDAR data.

Maximum (m) Minimum (m)
Measured data 6.808 0.15
Actual value (reference) 6.54 0.09
Error (%) 4 67

LiDAR, light detection and ranging.

IMU

In this study, the MPU6050 is used as the IMU sensor. The MPU6050 consists of a gyroscope and an accelerometer, and the sensor module is mounted facing upward. The orientation reference point is set when program execution starts. Because the accelerometer faces upward, the X- and Y-axis values are close to zero, whereas the Z-axis value is approximately one (gravity). The gyroscope and accelerometer data are given in Tables 5 and 6.
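For context, a minimal way to read the MPU6050 over I2C in Python is sketched below using the smbus library and the sensor's documented register map. The bus number and the scale factors (default ±2 g and ±250 °/s ranges) are assumptions for illustration and are not taken from this study's code.

```python
import smbus

BUS = 1          # I2C bus number on the Jetson Nano header (assumed)
ADDR = 0x68      # default MPU6050 I2C address

bus = smbus.SMBus(BUS)
bus.write_byte_data(ADDR, 0x6B, 0)   # PWR_MGMT_1: wake the device from sleep

def read_word(reg):
    """Read a signed 16-bit value from two consecutive registers."""
    high = bus.read_byte_data(ADDR, reg)
    low = bus.read_byte_data(ADDR, reg + 1)
    value = (high << 8) | low
    return value - 65536 if value & 0x8000 else value

# Accelerometer (registers 0x3B..) in g and gyroscope (0x43..) in deg/s,
# using the default full-scale factors 16384 LSB/g and 131 LSB/(deg/s).
accel = [read_word(0x3B + 2 * i) / 16384.0 for i in range(3)]
gyro = [read_word(0x43 + 2 * i) / 131.0 for i in range(3)]
print("accel (g):", accel, "gyro (deg/s):", gyro)
```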

Gyroscope orientation measurements.

X orientation Y orientation X reference Y reference X error Y error
−0.01 −0.01 0 0 −0.01 −0.01
−0.01 −0.01 0 0 −0.01 −0.01
−0.01 0.02 0 0 −0.01 0.02
−0.01 0.03 0 0 −0.01 0.03
−0.01 0.03 0 0 −0.01 0.03
0.00 0.03 0 0 0.00 0.03
−0.01 0.03 0 0 −0.01 0.03
0.00 0.00 0 0 0.00 0.00
0.00 0.00 0 0 0.00 0.00
−0.01 0.03 0 0 −0.01 0.03

Average error (%) 1 2

Accelerometer measurements.

X Y X reference Y reference X error Y error
0.02 −0.02 0 0 0.02 −0.02
0.01 −0.01 0 0 0.01 −0.01
−0.03 −0.02 0 0 −0.03 −0.02
−0.06 −0.02 0 0 −0.06 −0.02
−0.07 −0.02 0 0 −0.07 −0.02
−0.06 0.01 0 0 −0.06 0.01
−0.06 −0.03 0 0 −0.06 −0.03
0.01 0.01 0 0 0.01 0.01
−0.01 0.00 0 0 −0.01 0.00
−0.06 −0.02 0 0 −0.06 −0.02

Average error (%) 4 2
Encoder

The fusion of the dual encoders is executed using Eq. (6). The outcomes are the vehicle's coordinates on the X and Y axes. Sample data from the merged encoder outputs are presented in Tables 7 and 8. Calculated results are compared against a reference to establish the error value.

The comparison of the encoder results and the reference in X-axis.

X position from odometry X reference Error
0.95 0.97 0.02
0.95 0.97 0.03
0.95 0.97 0.03
0.95 0.97 0.03
0.95 0.98 0.03
0.95 0.98 0.04
0.95 0.98 0.04
0.95 0.99 0.04
0.95 0.99 0.04
0.95 1.00 0.05

Average error (%) 3

The comparison of the encoder results and the reference in Y-axis.

Y position from odometry Y reference Error
0.02 0.02 0.00
0.02 0.02 0.00
0.02 0.02 0.00
0.02 0.02 0.00
0.02 0.02 0.00
0.01 0.02 0.01
0.01 0.02 0.01
0.02 0.02 0.02
0.07 0.02 0.05
0.07 0.02 0.05

Average error (%) 2
Sensor Fusion with EKF

In this study, the sensor integration merges the MPU6050 sensor and the encoders through the EKF. The resulting fused data provide the vehicle's coordinates along the X and Y axes. In this experiment, 10 data samples were collected for analysis, and the outcomes of the EKF calculations were compared with the reference vehicle positions. The error values obtained for the EKF algorithm are presented in Table 9, and the errors in the XY-plane are shown graphically in Figure 7.
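For reference, the RMSE values reported below are taken to follow the standard definition; assuming $N$ samples with estimated positions $\hat{x}_i$ and reference positions $x_i$:

$\mathrm{RMSE} = \sqrt{\dfrac{1}{N}\sum_{i=1}^{N}\left(\hat{x}_i - x_i\right)^{2}}$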

Comparison between the EKF positions and the reference.

X position (EKF) X reference Y position (EKF) Y reference Error X Error Y
0.97 0.97 0.02 0.02 0.00 0.00
0.98 0.97 0.02 0.02 0.01 0.00
0.98 0.97 0.02 0.02 0.01 0.01
0.98 0.97 0.02 0.02 0.01 0.01
0.99 0.98 0.02 0.02 0.01 0.02
0.99 0.98 0.02 0.02 0.01 0.04
1.00 0.98 0.02 0.02 0.02 0.04
1.01 0.99 0.02 0.02 0.02 0.04
1.01 0.99 0.02 0.02 0.02 0.04
1.02 1.00 0.02 0.02 0.02 0.05

Average error (%) 1 3

RMSE (m) 0.11 0.15

EKF, extended Kalman filter; RMSE, root mean square error.

Figure 7:

Comparison of EKF position estimation results with references. During the experiment, we focus on two coordinates as the mobile robot moves—namely, the X and Y coordinates. Label A represents the robot's motion along the X coordinate, while label B corresponds to its movement along the Y coordinate. EKF, extended Kalman filter.

Table 9 and Figure 7 illustrate the outcomes of the EKF filtering algorithm in terms of localizing the robot's position. The outputs in Table 9 pertain to the robot's position along the X and Y axes. The X-axis exhibits a 1% average error and a root mean square error (RMSE) of 0.11 m, while the Y-axis exhibits a 3% average error and an RMSE of 0.15 m. These observations are validated by the graph in Figure 7a, which illustrates the EKF filter's position estimates (red line) compared to the reference (yellow line). Figure 7a demonstrates a minimal error between the EKF-based estimated and reference positions. These findings suggest that the EKF algorithm is effective at localizing the robot's position within a real-world laboratory setting. The same observations hold for the Y-axis, as depicted in Figure 7b.

Compared with similar studies, such as that by Kim and Park [29], who used sensor fusion with LiDAR and radar sensors, employed EKF and fuzzy algorithms, and tested with an actual outdoor vehicle prototype, their experiments achieved an RMSE of 0.38 m for the X position and 0.27 m for the Y position based on 1,882 position measurements. Our results yield a lower RMSE, implying better performance on this metric. Nonetheless, it is worth reiterating that the two investigations differ considerably in sensor selection, data collection environment, and data quantity, so the comparison must be interpreted with these substantial differences in mind.

The outcomes displayed within the RViz software correspond to the specific rostopics (from the ROS platform) employed in this simulation. The utilized rostopics are:

IMU, to visualize the MPU6050 sensor data;

Map, to visualize the LiDAR data as a map built with the Hector-SLAM algorithm;

odometry and PoseWithCovariant, to visualize the EKF estimation; and

TF, to visualize the vehicle frames.

Furthermore, apart from the performance data visualized in two dimensions and presented in Table 9 and Figure 7, this study also obtained image results by utilizing the RViz mapping and localization features. In Figure 8, the red arrows show the 2D pose estimate produced by the EKF sensor fusion. From this, it is evident that the robot can create maps by capturing distinctive elements of the physical surroundings with the LiDAR sensor while simultaneously achieving accurate localization.

Figure 8:

RViz visualization.

Discussion

According to the mapping results in Figure 6, the obtained maximum and minimum distance measurements have error margins of 4% and 67%, respectively. Furthermore, IMU testing and data gathering encompassed assessment of the gyroscope and accelerometer sensors. The collected test data were compared to the provided reference, yielding error data with respect to the position reference. The gyroscope testing results indicated a 1% error for position X and a 2% error for position Y, while the accelerometer testing results revealed a 4% error for position X and a 2% error for position Y. The encoder readings were tested next, showing a 3% error for position X and a 2% error for position Y.

The acquired data were fused using the EKF to estimate position. Position estimation yielded errors of 1% for X and 3% for Y. These results were further evaluated using RMSE, giving values of 0.11 m for X and 0.15 m for Y. Compared with the prior research by Kim and Park [29], whose experiments reported RMSE values of 0.38 m for X and 0.27 m for Y, our experiments obtained lower errors. However, it is important to emphasize that the two studies differ significantly in sensor choices, data collection settings, and data volume, and the lower RMSE of our approach should be assessed with these substantial differences in mind.

To enhance the accuracy of distance measurements, it is recommended to investigate and implement techniques that address the wide range of errors observed in mapping. Additionally, while the gyroscope and accelerometer sensors showed promise, further calibration and testing may be necessary for precise navigation. For encoder readings, error mitigation strategies should be explored, and the EKF can be tuned further to improve position estimation accuracy. Lastly, when making further comparisons with other studies, it is crucial to account for differences in sensor selection, data collection conditions, and data volume.

Future research should focus on refining and optimizing distance measurement techniques to reduce errors. Further exploration of sensor calibration methods can enhance the reliability of gyroscope and accelerometer data. Investigation of advanced error correction mechanisms for encoder readings can contribute to more accurate position estimation. Additionally, research efforts can be directed toward improving the EKF algorithm for better data fusion. Comparative studies should be conducted meticulously considering experimental variables to provide more meaningful insights into autonomous navigation technology.

Conclusions

For over two decades, SLAM has received significant attention in the mobile robotics community. This study integrates the EKF with the SLAM framework, successfully improving autonomous vehicles' mapping and localization performance. This study's real-world and measurement scenario was executed through experiments in the ECS laboratory at the Department of Engineering Physics, Sepuluh Nopember Institute of Technology. The systematic fusion of data from the MPU6050 sensor and encoders through the EKF yields a harmonized dataset capturing the vehicle's precise coordinates along the X and Y axes. The fusion of perception and mapping technologies facilitated by the EKF within the SLAM framework emerges as a transformative tool, promising precision and accuracy in complex real-world scenarios. As the journey toward fully autonomous vehicles continues, the insights from this study contribute substantively to the discourse, illuminating a pathway where autonomous vehicles can navigate the urban landscape with confidence, accuracy, and adaptability.
