Department of Computer Engineering & A.I., Military Technical College, Cairo, Egypt.
DOI: 10.1088/1742-6596/2616/1/012024
Abstract
Streak detection is important for space situational awareness and space asset protection, where moving targets (e.g., satellites, space debris, or meteorites) must be detected in images of the sky. This paper compares two astronomical frameworks for streak detection based on deep neural networks. The first framework combines the extended feature pyramid network (EFPN) with the faster region-based convolutional neural network (Faster R-CNN); the second combines the feature pyramid network (FPN) with Faster R-CNN. Because publicly available astronomical data sets are scarce, a simulated data set is used to train the networks. Mean average precision (mAP), recall, precision, and F1 score are used to measure the performance of the two frameworks. The experimental results confirm that the EFPN-based framework achieves a significant improvement in streak detection over the FPN-based Faster R-CNN framework.
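For reference, the evaluation metrics named in the abstract (precision, recall, and F1 score) follow the standard object-detection definitions. The sketch below computes them from hypothetical true-positive/false-positive/false-negative counts; it is not the authors' evaluation code, and mAP, which additionally averages precision over recall levels and matching thresholds, is omitted for brevity.

```python
# Minimal sketch of the detection metrics named in the abstract.
# The counts used in the example are hypothetical placeholders,
# not results from the paper.

def precision(tp: int, fp: int) -> float:
    """Fraction of predicted streaks that are true streaks."""
    return tp / (tp + fp) if (tp + fp) else 0.0

def recall(tp: int, fn: int) -> float:
    """Fraction of true streaks that were detected."""
    return tp / (tp + fn) if (tp + fn) else 0.0

def f1_score(p: float, r: float) -> float:
    """Harmonic mean of precision and recall."""
    return 2 * p * r / (p + r) if (p + r) else 0.0

if __name__ == "__main__":
    tp, fp, fn = 90, 10, 15  # hypothetical detection counts
    p, r = precision(tp, fp), recall(tp, fn)
    print(f"precision={p:.3f} recall={r:.3f} f1={f1_score(p, r):.3f}")
```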
APA
Elhakiem, A., Ghoniemy, T., & Salama, G. (2023). Streak detection in astronomical images based on convolutional neural network. International Conference on Aerospace Sciences and Aviation Technology, 20(20), 1-9. doi: 10.1088/1742-6596/2616/1/012024
MLA
A A Elhakiem; T S Ghoniemy; G I Salama. "Streak detection in astronomical images based on convolutional neural network", International Conference on Aerospace Sciences and Aviation Technology, vol. 20, no. 20, 2023, pp. 1-9. doi: 10.1088/1742-6596/2616/1/012024
HARVARD
Elhakiem, A., Ghoniemy, T., Salama, G. (2023). 'Streak detection in astronomical images based on convolutional neural network', International Conference on Aerospace Sciences and Aviation Technology, 20(20), pp. 1-9. doi: 10.1088/1742-6596/2616/1/012024
VANCOUVER
Elhakiem, A., Ghoniemy, T., Salama, G. Streak detection in astronomical images based on convolutional neural network. International Conference on Aerospace Sciences and Aviation Technology, 2023; 20(20): 1-9. doi: 10.1088/1742-6596/2616/1/012024