TDL-GNSS

A tightly coupled deep learning framework for GNSS

News

2024.10.25 New dataset available! We have open-sourced the complete KLT dataset with LOS/NLOS labels and other sensor data. We hope this dataset can serve as a benchmark for deep-learning-aided tightly coupled GNSS. Please find it at KLTDataset.

Tightly Coupled Deep Learning and GNSS Integration Framework

This subsystem is built on pyrtklib and is designed to tightly integrate deep learning into the GNSS (Global Navigation Satellite System) processing workflow. The preprint of our paper is available on arXiv. We would greatly appreciate it if you could cite our work:

@misc{hu2024pyrtklibopensourcepackagetightly,
      title={pyrtklib: An open-source package for tightly coupled deep learning and GNSS integration for positioning in urban canyons}, 
      author={Runzhi Hu and Penghui Xu and Yihan Zhong and Weisong Wen},
      year={2024},
      eprint={2409.12996},
      archivePrefix={arXiv},
      primaryClass={cs.LG},
      url={https://arxiv.org/abs/2409.12996}, 
}

Features

Several useful functions are available in rtk_util.py, including:

  • Weighted Least Squares Solver: Implemented in both NumPy and PyTorch, this solver lets you adjust the per-satellite weights and pseudorange biases during the GNSS solution process.
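The idea behind an adjustable WLS solver can be sketched as follows. This is a minimal NumPy version; the function name and signature are illustrative, not rtk_util.py's actual API:

```python
import numpy as np

def wls_solve(H, prange_res, weights=None, bias=None):
    """One weighted least-squares step for a GNSS position update.

    H          -- (n, 4) design matrix (unit line-of-sight vectors + clock column)
    prange_res -- (n,) pseudorange residuals (observed minus predicted)
    weights    -- (n,) per-satellite weights; equal weights if None
    bias       -- (n,) pseudorange bias corrections, subtracted from the
                  residuals before solving
    """
    r = prange_res if bias is None else prange_res - bias
    W = np.eye(len(r)) if weights is None else np.diag(weights)
    # Normal equations: dx = (H^T W H)^-1 H^T W r
    return np.linalg.solve(H.T @ W @ H, H.T @ W @ r)
```

A learned weight vector simply replaces the `weights` argument, and a learned bias correction replaces `bias`, leaving the solver itself unchanged.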

Dataset

Update: the complete dataset, KLTDataset, is now available and includes everything described below. Come and play!


We provide a dataset named KLT, which includes three subsets, KLT1, KLT2, and KLT3, split by timestamp. Each subset includes 100 Hz ground truth generated by our ground-truth collection platform. You can download the dataset from Dropbox. After downloading, extract the contents into a folder named data.

Sensor Kit

For more details on the sensor kit we use, please refer to the UrbanNav dataset. If you're interested in additional sensor data (e.g., camera, LiDAR, IMU) to build a multimodal architecture, you can download ROS bag files from this Dropbox folder.

Framework

Training

Training Process

This is the training process. In this example, we demonstrate a simple network architecture that takes C/N0, elevation angle, and the residuals of an equal-weight least-squares solution as inputs. The network predicts the pseudorange bias and weight, which are then used in a weighted least squares (WLS) solver to compute the position.

The loss is computed on the difference between the ground-truth position and the predicted position, and it is backpropagated through the WLS solver to train the network.
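Because the WLS step is itself differentiable, the position loss can flow back into the network. The helper below is a sketch of that idea in PyTorch, not the repository's exact code; the training-step names in the comments are illustrative:

```python
import torch

def wls_position(H, prange_res, weight, bias):
    """Differentiable weighted least-squares step.

    The learned per-satellite weight and pseudorange bias flow through
    torch.linalg.solve, so a loss on the resulting position update
    backpropagates into the network that produced them.
    """
    r = prange_res - bias           # apply the learned bias correction
    W = torch.diag(weight)          # learned per-satellite weights
    return torch.linalg.solve(H.T @ W @ H, H.T @ W @ r)

# A hypothetical training step (net, optimizer, and feature names are
# illustrative, not the repository's actual API):
#   weight, bias = net(features).unbind(-1)
#   dx = wls_position(H, res, torch.nn.functional.softplus(weight), bias)
#   loss = torch.nn.functional.mse_loss(init_guess[:3] + dx[:3], gt_pos)
#   loss.backward(); optimizer.step()
```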

Prediction

Prediction Process

In the prediction process, just as in training, the output of the WLS solver is taken as the final predicted position.

Install

  1. Clone the repo. git clone git@github.com:ebhrz/TDL-GNSS.git
  2. Install the requirements. pip install -r requirements.txt

Usage

This repository provides three types of network architectures:

  • Bias Network: Predicts only the pseudorange bias correction. Associated files:

  • bias_network_train.py

  • bias_network_predict.py

  • Weight Network: Predicts only the weight used in the WLS. Associated files:

  • weight_network_train.py

  • weight_network_predict.py

  • Hybrid Network: Predicts both the pseudorange bias and the weight. Associated files:

  • hybrid_network_train.py

  • hybrid_network_predict.py
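As a concrete picture of the hybrid variant, a per-satellite head mapping input features (e.g. C/N0, elevation, residual) to a bias and a positive weight could be sketched as follows. The layer sizes and names are hypothetical, not the repository's actual architecture:

```python
import torch

class HybridNet(torch.nn.Module):
    """Sketch of a per-satellite hybrid head predicting (bias, weight)."""

    def __init__(self, n_features=3, hidden=32):
        super().__init__()
        self.mlp = torch.nn.Sequential(
            torch.nn.Linear(n_features, hidden),
            torch.nn.ReLU(),
            torch.nn.Linear(hidden, 2),  # -> (pseudorange bias, raw weight)
        )

    def forward(self, feats):            # feats: (n_sats, n_features)
        bias, raw_w = self.mlp(feats).unbind(-1)
        # softplus keeps the WLS weights strictly positive
        return bias, torch.nn.functional.softplus(raw_w)
```

A bias-only or weight-only network would keep the same shape but emit a single output per satellite.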

Configuration

Before training a network, you need to create a configuration file in JSON format. Below are the available parameters:

  • obs: The RINEX observation file, either a single file name or a list of file names.
  • eph: The ephemeris files, which can also be a single file name or a list of file names.
  • gt: The ground truth file.
  • start_time: An integer giving the start time of the RINEX file as a GPS-time Unix timestamp.
  • end_time: An integer giving the end time of the RINEX file as a GPS-time Unix timestamp.
  • model: The directory where the trained model will be saved.
  • mode: Currently not used, but it may be implemented in the future for a unified interface.
  • epoch: Used for training to specify the number of epochs.

Examples of configuration files can be found in the config folder.
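Assembled from the parameters above, a training configuration might look like the following (all file paths and timestamps are illustrative, not actual dataset paths):

```json
{
  "obs": ["data/KLT3/rover.obs"],
  "eph": ["data/KLT3/eph/brdc.nav"],
  "gt": "data/KLT3/ground_truth.csv",
  "start_time": 1623304800,
  "end_time": 1623308400,
  "model": "model/hybrid_klt3",
  "epoch": 100
}
```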

Training a Model

To train a network, use the following command:
python {type}_train.py config.json

Predicting with a Model

To use a model for prediction, use the following command:
python {type}_predict.py config.json
In both commands, {type} can be replaced with bias, weight, or hybrid depending on the architecture you are using.

Examples

Here are examples for training and predicting with the hybrid network:

python hybrid_network_train.py config/hybrid_share/klt3_train.json
python hybrid_network_predict.py config/hybrid_share/klt1_predict.json

Baseline

We also provide a baseline script to generate results using goGPS and RTKLIB. You can use any prediction configuration file with this script:

python3 baseline.py config/bias/klt1_predict.json

Training Tips

  1. Data Normalization: All networks normalize the input data first, which ties a trained model to the specific receiver used to collect the training data. For example, a model trained on data from a u-blox F9P receiver may not perform well on data collected from a different device, such as a smartphone.

  2. Initial Position Guess: When training, avoid setting the initial position guess to (0, 0, 0), as this causes drastic changes in the H matrix and therefore distorts the gradients. A solution from an equal-weight least squares is a more reasonable starting point.

  3. Training Epochs for Hybrid Network: When training a hybrid network (bias and weight combined), use fewer epochs compared to training bias-only or weight-only networks. Training for too many epochs can lead to overfitting. For the KLT3 dataset, we recommend around 100 training epochs.

  4. Note for Windows: If you are running this on a Windows system, change the path separator "/" to "\\" in the config file. For example:

    {
       "obs": ["data\\0610_KLT\\GEOP161D.21o"],
       "eph": "data\\0610_KLT\\sta\\hksc161d.21*"
    }

Citation

If you find this tool useful, we would appreciate it if you cite our paper using the BibTeX entry above.

Challenge

The dataset includes a file, GEOP161D.21o, collected with a Huawei P40 phone. So far, we have not found parameters that train a model to outperform goGPS and RTKLIB on this data. Everyone is welcome to give it a try! :)
