
Tutorial on Training and Testing Baselines on V2X-Real dataset

Tutorials on Intermediate Fusion Training on V2X-Real dataset

Since V2X-Real uses multi-class predictions, the exact commands differ slightly from those for OPV2V and DAIR-V2X. These training and testing instructions apply to all end-to-end training methods. Note that we adopt HEAL as the codebase structure and currently only support collaboration base training.

Train the model

We use yaml files to configure all the parameters for training. To train your own model from scratch or continue from a checkpoint, run the following command:

python opencood/tools/train.py -y ${CONFIG_FILE} [--model_dir ${CHECKPOINT_FOLDER}]

Arguments Explanation:

  • -y or hypes_yaml: the path of the training configuration file, e.g. opencood/hypes_yaml/opv2v/LiDAROnly/lidar_fcooper.yaml, meaning you want to train an FCooper model. Each entry of the yaml is elaborated in the exemplar config file opencood/hypes_yaml/exemplar.yaml.
  • model_dir (optional): the path to the checkpoint folder. This is used to fine-tune or continue training. When model_dir is given, the trainer will discard the hypes_yaml and load the config.yaml in the checkpoint folder. In this case, ${CONFIG_FILE} can be None. See the example below.
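
For example, to start an FCooper run from scratch and then continue training from its checkpoint folder (the log folder path below is illustrative; substitute the folder where your checkpoints were actually saved):

python opencood/tools/train.py -y opencood/hypes_yaml/opv2v/LiDAROnly/lidar_fcooper.yaml
python opencood/tools/train.py -y None --model_dir opencood/logs/my_fcooper_run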

Train the model in DDP

CUDA_VISIBLE_DEVICES=0,1 python -m torch.distributed.launch  --nproc_per_node=2 --use_env opencood/tools/train_ddp.py -y ${CONFIG_FILE} [--model_dir ${CHECKPOINT_FOLDER}]

--nproc_per_node indicates the number of GPUs you will use.
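
For example, to launch the same training on four GPUs:

CUDA_VISIBLE_DEVICES=0,1,2,3 python -m torch.distributed.launch --nproc_per_node=4 --use_env opencood/tools/train_ddp.py -y ${CONFIG_FILE}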

Test the model

python opencood/tools/inference_mc.py --model_dir ${CHECKPOINT_FOLDER} [--fusion_method intermediate]
  • inference_mc.py has more optional arguments; you can inspect the file for details.
  • [--fusion_method intermediate]: the default fusion method is intermediate fusion. Depending on the fusion strategy used in training, available fusion_method values are the following (see the example after the list):
    • single: only ego agent's detection, only ego's gt box. [only for late fusion dataset]
    • no: only ego agent's detection, all agents' fused gt box. [only for late fusion dataset]
    • late: late fusion detection from all agents, all agents' fused gt box. [only for late fusion dataset]
    • early: early fusion detection from all agents, all agents' fused gt box. [only for early fusion dataset]
    • intermediate: intermediate fusion detection from all agents, all agents' fused gt box. [only for intermediate fusion dataset]
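
For example, to evaluate an intermediate fusion checkpoint (the checkpoint folder name below is illustrative):

python opencood/tools/inference_mc.py --model_dir opencood/logs/v2xreal_lidar_intermediate_fusion --fusion_method intermediate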

Notes:

  • You can refer to the /scripts folder for example running scripts. mc stands for multi-class, which distinguishes these scripts from single-class training and inference. /scripts/inference_mc/inference_mc_fp.sh runs full-precision inference, as opposed to /scripts/inference_mc/inference_mc_quant.sh, which involves a post-training quantization (PTQ) stage.
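
For instance, the full-precision multi-class inference script could be launched as follows (check the script beforehand for any paths or arguments it expects):

bash ./scripts/inference_mc/inference_mc_fp.sh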

Tutorials on Early/Late Fusion Training on V2X-Real dataset

Early fusion fuses the raw LiDAR point clouds from neighboring agents to create a more holistic view of the environment, leading to better predictions. Late fusion receives independent 3D detections (bounding boxes) from neighboring agents and fuses them to produce consistent and more accurate predictions.

Stage 1: Train the full-precision model

We use yaml files to configure all the parameters for training. To train your own model from scratch or continue from a checkpoint, run the following command:

python ./opencood/tools/train.py -y ./opencood/hypes_yaml/v2x_real/LiDAROnly/lidar_[early/late]_mc_fusion.yaml
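
The [early/late] placeholder means you pick one of the two configs, i.e. one of:

python ./opencood/tools/train.py -y ./opencood/hypes_yaml/v2x_real/LiDAROnly/lidar_early_mc_fusion.yaml
python ./opencood/tools/train.py -y ./opencood/hypes_yaml/v2x_real/LiDAROnly/lidar_late_mc_fusion.yaml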

Test the model

python opencood/tools/inference_mc.py --model_dir ${CHECKPOINT_FOLDER} [--fusion_method early/late]
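
For example, assuming you have one checkpoint folder per fusion strategy:

python opencood/tools/inference_mc.py --model_dir ${EARLY_FUSION_CHECKPOINT_FOLDER} --fusion_method early
python opencood/tools/inference_mc.py --model_dir ${LATE_FUSION_CHECKPOINT_FOLDER} --fusion_method late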

Notes:

  • You can also run single-class early/late fusion with the yaml files lidar_early_fusion.yaml and lidar_late_fusion.yaml in the ./opencood/hypes_yaml/dairv2x/LiDAROnly folder. Note that you will then need to test the model with inference.py instead of inference_mc.py; a sketch is shown below.
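
For example, a single-class late fusion run could look like this (assuming inference.py accepts the same --model_dir and --fusion_method arguments as inference_mc.py):

python ./opencood/tools/train.py -y ./opencood/hypes_yaml/dairv2x/LiDAROnly/lidar_late_fusion.yaml
python opencood/tools/inference.py --model_dir ${CHECKPOINT_FOLDER} --fusion_method late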