This is our senior project, in which we explore the development of a maritime vessel detection system using a network of marine radars simulated in Unity. The objective is to enhance maritime surveillance capabilities in the UAE by addressing limitations of traditional radar systems, such as range and coverage constraints, and by adding an extra data layer for maritime surveillance. By integrating deep learning with a network of buoy-mounted radars, this project aims to provide unmanned, real-time maritime situational awareness. We propose a cost-effective, scalable solution that uses a simulated environment to train and test radar-based marine vessel detection: a proof of concept that, if implemented in real life, would improve national security and economic stability in the maritime domain. Our trained model, CenterNet, achieved an F1-score of 0.938. Additionally, our network of radars sends the detected ships' locations to the database, from which they are retrieved for visualization. For detailed technical information, refer to the accompanying Project's Report.pdf.
- Introduction
- Project's Subsystems
- Installation
- Usage
- Project Structure
- Configuration Files
- Future Work
- Senior Project Team
## Project's Subsystems

This block diagram shows the various subsystems of our project. The user begins the simulation by providing the simulation settings, radar settings, simulation scenario, and radar locations. The simulation system creates the required radars and vessels according to the settings and scenario provided, and stores the radar locations in the database. Each simulated radar detects the simulated vessels and exports its output as a PPI image, which is communicated to the onboard software and stored in the database. The radar onboard software detects the vessels in the PPI images, converts the vessel locations from positions relative to the radar to absolute positions on a map, and stores the vessel locations in the database. These vessel locations are then sent to the visualization platform for plotting onto a map, where the user can input specific map parameters such as scale and orientation.
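The relative-to-absolute conversion performed by the onboard software can be sketched as follows. This is a minimal illustration using a flat-earth approximation (reasonable for the short ranges of a buoy-mounted radar); the function and variable names are ours, not the project's actual code.

```python
import math

def to_absolute(radar_lat, radar_lon, range_m, bearing_deg):
    """Convert a detection given as (range, bearing) relative to a radar
    into an absolute latitude/longitude, using a flat-earth approximation.

    bearing_deg is measured clockwise from north, as on a PPI display."""
    EARTH_RADIUS = 6_371_000.0  # metres
    bearing = math.radians(bearing_deg)
    # North/east displacement of the target from the radar, in metres.
    d_north = range_m * math.cos(bearing)
    d_east = range_m * math.sin(bearing)
    # Convert metres to degrees around the radar's latitude.
    d_lat = math.degrees(d_north / EARTH_RADIUS)
    d_lon = math.degrees(d_east / (EARTH_RADIUS * math.cos(math.radians(radar_lat))))
    return radar_lat + d_lat, radar_lon + d_lon

# Example: a target 1 km due east of a radar near Khorfakkan.
lat, lon = to_absolute(25.35, 56.35, 1000.0, 90.0)
```

The visualization platform can then plot the returned latitude/longitude pair directly on the map.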
## Installation

Prerequisites:

- Unity: Version 2022.3.40f1
- Docker
- Python
- Conda
- Clone the Repository:

  ```shell
  git clone https://github.com/yal77/RadarSimulation.git
  ```
- Navigate to the Cloned Directory:

  ```shell
  cd RadarSimulation
  ```
- Create and Activate Conda Environment:

  ```shell
  conda create --name deep_learning python
  conda activate deep_learning
  ```
- Install Dependencies:

  ```shell
  pip install -r requirements.txt
  ```
- Set Up Unity:
  - Add project from disk.
  - Choose `RadarSimulation -> RadarProject`.
  - Use version `2022.3.40f1`.
- Set Up Visualization Platform:

  ```shell
  cd Visualization
  npm install
  ```
## Usage

This project can be used to:
- Generate a dataset of radar PPI images.
- Train deep learning models for vessel detection.
- Run the entire system (Simulation System, Onboard Software, and Visualization Platform) to simulate the Khorfakkan scene, predict vessel locations, and visualize them in the web application.

Follow the steps below to generate your own dataset, or use the pre-generated dataset available in this repository.
- Create a Configuration File: Create a YAML file specifying the desired simulation parameters (refer to the table below). You can use the example file `sim-config-example.yaml` as a template, but ensure that the `sceneName` is set to `OceanMain` when generating a dataset.
- Run the Dataset Generation Script:

  ```shell
  python ML/datasetGen/generateDataset.py path/to/config.yaml path/to/unity/executable path/to/output/directory
  ```

  - Replace `path/to/config.yaml` with the path to your configuration file.
  - Replace `path/to/unity/executable` with the path to the Unity build executable of the project (you need to create this executable).
  - Replace `path/to/output/directory` with the directory where the dataset will be saved.
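Before launching a long generation run, it can be worth sanity-checking the configuration first. Below is a minimal sketch of such a check on a config loaded as a plain Python dict; the key names come from the parameter table below, but the validation rules themselves are our own suggestion, not part of the repository.

```python
# Keys a dataset-generation config is expected to contain (subset of the
# parameter table; see sim-config-example.yaml for the full template).
REQUIRED_KEYS = [
    "sceneName", "nships", "nLocations", "coordinateSquareWidth", "speed",
    "nRadars", "nScenarios", "scenarioTimeLimit", "generateDataset",
    "unityBuildDirectory", "outputDirectory",
]

def validate_config(cfg):
    """Return a list of human-readable problems; an empty list means OK."""
    errors = [f"missing key: {key}" for key in REQUIRED_KEYS if key not in cfg]
    if errors:
        return errors
    if cfg["sceneName"] not in ("OceanMain", "KhorfakkanCoastline"):
        errors.append("sceneName must be OceanMain or KhorfakkanCoastline")
    # Ranged parameters are [min, max] pairs and must be ordered.
    for key in ("nships", "nLocations", "speed"):
        lo, hi = cfg[key]
        if lo > hi:
            errors.append(f"{key}: min {lo} exceeds max {hi}")
    return errors

example = {
    "sceneName": "OceanMain", "nships": [1, 5], "nLocations": [2, 4],
    "coordinateSquareWidth": 5000, "speed": [5, 20], "nRadars": 4,
    "nScenarios": 10, "scenarioTimeLimit": 60, "generateDataset": True,
    "unityBuildDirectory": "Build", "outputDirectory": "dataset",
}
problems = validate_config(example)  # empty list for this example
```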
| Parameter | Description |
| --- | --- |
| `sceneName` | Scene to start the simulation in (`OceanMain` or `KhorfakkanCoastline`) |
| `nships` | Number of ships in a scenario, defined by a range [minAmount, maxAmount] |
| `nLocations` | Number of locations a ship visits during a scenario, defined by a range [minAmount, maxAmount] |
| `coordinateSquareWidth` | Width of the ship generation space |
| `speed` | Ship movement speed (in knots), defined by a range [minSpeed, maxSpeed] |
| `radarRows` | Number of rows in the radar lattice network |
| `radarPower` | Transmitted radar power in W |
| `radarGain` | Radar gain in dB |
| `waveLength` | Radar wavelength in m |
| `radarImageRadius` | Width of the output data array (pixels) |
| `antennaVerticalBeamWidth` | Vertical angle of the radar beam |
| `antennaHorizontalBeamWidth` | Horizontal angle of the radar beam |
| `rainRCS` | RCS value of a single raindrop |
| `nRadars` | Number of radars in a scenario |
| `nScenarios` | Number of scenarios to generate |
| `scenarioTimeLimit` | Time limit for a scenario before it ends and the next begins |
| `weather` | List of weather conditions to cycle through for each scenario |
| `waves` | List of wave conditions to cycle through for each scenario |
| `proceduralLand` | List of booleans to cycle through (whether procedural land is generated or not) |
| `generateDataset` | Flag to generate a dataset |
| `unityBuildDirectory` | Directory of the Unity build |
| `outputDirectory` | Directory for output files |
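Putting these parameters together, a configuration might look like the fragment below. All values are illustrative only, and the weather/wave condition names shown are placeholders; refer to `sim-config-example.yaml` for the authoritative template and accepted values.

```yaml
sceneName: OceanMain        # must be OceanMain when generating a dataset
nships: [1, 5]              # [minAmount, maxAmount] ships per scenario
nLocations: [2, 4]          # [minAmount, maxAmount] waypoints per ship
coordinateSquareWidth: 5000
speed: [5, 20]              # knots, [minSpeed, maxSpeed]
radarRows: 2
radarPower: 1000            # W
radarGain: 30               # dB
waveLength: 0.03            # m
radarImageRadius: 512       # pixels
antennaVerticalBeamWidth: 25
antennaHorizontalBeamWidth: 2
rainRCS: 0.001
nRadars: 4
nScenarios: 10
scenarioTimeLimit: 60
weather: [Clear, Rain]      # placeholder condition names
waves: [Calm, Moderate]     # placeholder condition names
proceduralLand: [true, false]
generateDataset: true
unityBuildDirectory: Build
outputDirectory: dataset
```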
We implemented two models that you can train: CenterNet and YOLO from Ultralytics.
- Train CenterNet:
  - Change `json_directory` to your dataset's location in `ML/CenterNet/main.py` and run `main.py`.
- Train YOLO:
  - Change the dataset path directory in `ppi_dataset.yaml`. This is the directory with the images the model will train on.
  - Modify `json_directory` and `save_directory` in `ML/yolo/train.py` (`save_directory` should match the dataset directory in `ppi_dataset.yaml`) and run `train.py`.
  - The training script will convert the JSON files into the format expected by YOLO, place them in the dataset directory, and train the model.
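The JSON-to-YOLO conversion step can be sketched as below. The annotation schema assumed here (pixel-space boxes with `x`/`y`/`w`/`h` keys and a top-left origin) is our own illustration; the repository's actual JSON layout may differ. The YOLO label format itself (one `class x_center y_center width height` line per object, normalised to [0, 1]) is standard.

```python
def ships_to_yolo_lines(ships, image_size):
    """Convert pixel-space bounding boxes to YOLO label lines.

    ships: list of dicts like {"x": ..., "y": ..., "w": ..., "h": ...},
    where (x, y) is the box's top-left corner in pixels.
    image_size: width/height of the (square) PPI image in pixels.
    Class 0 is "ship", the only class in this dataset."""
    lines = []
    for ship in ships:
        xc = (ship["x"] + ship["w"] / 2) / image_size  # normalised centre x
        yc = (ship["y"] + ship["h"] / 2) / image_size  # normalised centre y
        w = ship["w"] / image_size
        h = ship["h"] / image_size
        lines.append(f"0 {xc:.6f} {yc:.6f} {w:.6f} {h:.6f}")
    return lines

# A 100x100-pixel box at the top-left corner of a 200-pixel PPI image.
labels = ships_to_yolo_lines([{"x": 0, "y": 0, "w": 100, "h": 100}], 200)
```

Each resulting list would be written to a `.txt` file named after its image in the dataset directory.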
To run the entire system on the Khorfakkan scene:

- Change `sim-config-example.yaml` (or your version of it) to use the `KhorfakkanCoastline` scene.
- Add your Conda path to `possible_paths` in `start_services.py`.
- Ensure Docker is running.
- Create a copy of `service_config-example.yaml` with your own paths.
- Run `python run.py sim-config-example.yaml`.
- Run `python start_services.py service_config-example.yaml`.
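The `possible_paths` mechanism in `start_services.py` probes a list of candidate Conda locations and uses the first one that exists, which is why your own Conda path must be added to the list. The idea can be sketched as follows; the candidate paths and function name here are illustrative, not the script's actual contents.

```python
import os

def find_conda(possible_paths):
    """Return the first existing, executable conda binary from the
    candidate list, or None if no candidate is present on this machine."""
    for path in possible_paths:
        if os.path.isfile(path) and os.access(path, os.X_OK):
            return path
    return None

# Typical install locations (illustrative; extend with your own path).
candidates = [
    os.path.expanduser("~/miniconda3/bin/conda"),
    os.path.expanduser("~/anaconda3/bin/conda"),
    "/opt/conda/bin/conda",
]
conda_path = find_conda(candidates)  # None if Conda is not found
```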
## Project Structure

Below is an overview of the key folders and their purposes:
### `ML/`

Contains machine learning models and scripts for dataset generation, training, and inference.

- `CenterNet/`: Implements the CenterNet model for vessel detection.
- `datasetGen/`: Script for generating datasets.
- `yolo/`: YOLO-based model implementation.
### Onboard Software

Handles radar image processing and vessel detection onboard.

- Key scripts:
  - `centernet-infer.py`: Performs inference using the CenterNet model.
  - `onboard-yolo.py`: Handles onboard YOLO model operations.
  - `yolo_infer.py`: Inference script for YOLO.
### `RadarProject/`

Unity project for radar simulation.

- `Assets/`:
  - `Materials/`: Contains material configurations for land and ocean.
  - `Models/`: Includes 3D models for ships, buoys, and other objects.
  - `Oceans/`: Ocean generation and environmental settings.
  - `Scenes/`: Unity scenes for simulation:
    - `KhorfakkanCoastline.unity`: Scene representing the Khorfakkan coastline, used for testing.
    - `OceanMain.unity`: Randomized scene used for training.
  - `Scripts/`: Contains Unity C# scripts for simulation behavior.
    - `Buoyancy/`: Implements ship buoyancy physics by simulating interactions with water surfaces.
    - `Camera/`: Manages camera control and perspective for navigating the simulation environment.
    - `Khorfakkan Coastline/`: Generates and manages terrain data for the Khorfakkan Coastline simulation scene.
    - `Procedural Land Generation/`: Creates dynamic, procedurally generated terrain for simulation scenarios. Implemented following parts of Sebastian Lague's tutorial.
    - `Radar/`: Handles core radar operations.
    - `Scenario/`: Manages simulation scenarios, including settings, configurations, and data import/export.
    - `Shaders/`: Defines GPU-accelerated operations for radar data processing and visualization.
    - `Ship Movement/`: Controls ship movements and manages their positions within the simulation.
    - `Weather & Waves/`: Simulates environmental conditions such as weather and ocean wave dynamics.
### `Visualization/`

Web-based visualization platform for plotting detected vessels on a map.

- `DB_API/`: Backend API for database interactions.
- `src/`: Frontend source code for the visualization interface.
## Configuration Files

Here is a brief overview of each configuration file's role:

- `ppi_dataset.yaml`: Defines the dataset structure for training the YOLO model, specifying the paths to training and validation data along with class labels (e.g., ship). It ensures the model correctly locates and processes the data during training.
- `sim-config-example.yaml`: A template for defining the simulation configuration parameters listed in the earlier table, allowing users to customize scenarios for generating radar PPI datasets or for testing.
- `service_config-example.yaml`: Defines the paths, environment, and settings required to initialize and manage the various components of the entire system, including the database, API, onboard software, and visualization platform.
## Future Work

- Enhance system security with encrypted data transmission and storage.
- Address the limitation of reflectivity by integrating material-specific radar reflections instead of treating all materials uniformly.
- Expand the simulation system to include more diverse weather scenarios.
- Focus on the following aspects for real-world deployment:
- Hardware integration.
- Power solutions.
- Reducing radar interference.
- Ensuring long-term system durability.
## Senior Project Team

This project was done by arcarum, Yousif Alhosani, Mohammad Yaser Azrak, and Ibrahim Baig.