|dc.description.abstract||Recent advances in Driver-Assisted Truck Platooning (DATP) have shown success in linking multiple trucks in leader-follower platoons using Cooperative Adaptive Cruise Control (CACC). Such setups allow closer spacing between trucks, which leads to fuel savings. Given that frontal collisions are the most common type of highway accident for heavy trucks, one key issue for truck platooning is handling situations in which vehicles cut in between platooning trucks. More accurate and quicker predictions of cut-in behavior would improve the safety and efficiency of truck platooning by prompting the control system to react to the intruder sooner, allowing proper spacing to be established before the cut-in occurs.
This thesis implements a deep neural network that generates multimodal predictions of traffic agents around a truck platoon in a simulated environment, culminating in testing on data obtained from the Auburn truck platoon. The method uses Long Short-Term Memory (LSTM) networks in an ensemble architecture to predict multiple possible future positions of vehicles passing a truck platoon over a 5-second prediction horizon and classifies the potential vehicle behavior as ‘passing’ or ‘cut-in’ with associated certainties. The network’s performance is compared to a baseline of common state-based predictors: the Constant Velocity Predictor, the Constant Acceleration Predictor, and the Constant Turn Radius Predictor.
The Ensemble LSTM network is shown to be a promising predictor, outperforming the state-based predictors over a 5-second prediction horizon with a lower average and standard deviation of root mean squared error across 1000 test trajectories. The network’s predictions also support a cut-in detector, which accurately identifies cut-in behavior on test trajectories with a balanced accuracy of 87.6 percent. Finally, the network is run on data collected from the Auburn truck platoon to demonstrate the viability of adapting the system to real-world testing and development.||en_US