Kaihong Huang

PhD Student (Graduated 2018)
Contact:
Email: huang@igg.uni-bonn.de
Tel: +49 – 228 – 73 –
Fax: +49 – 228 – 73 – 27 12
Office: Nussallee 15
Address:
University of Bonn
Photogrammetry, IGG
Nussallee 15
53115 Bonn

Research Interests

  • Visual Mapping, SLAM
  • Computer Vision, Optimization
  • Control Theory
  • UAVs, Soccer Robot

Short CV

Kaihong Huang was a PhD student at the Photogrammetry & Robotics Lab of the University of Bonn until 2018.

Publications

2024

  • S. Pan, L. Jin, X. Huang, C. Stachniss, M. Popović, and M. Bennewitz, “Exploiting Priors from 3D Diffusion Models for RGB-Based One-Shot View Planning,” in Proc. of the IEEE/RSJ Intl. Conf. on Intelligent Robots and Systems (IROS), 2024.
    [BibTeX] [PDF]
    @inproceedings{pan2024iros,
    author = {S. Pan and L. Jin and X. Huang and C. Stachniss and M. Popovi\'c and M. Bennewitz},
    title = {{Exploiting Priors from 3D Diffusion Models for RGB-Based One-Shot View Planning}},
    booktitle = iros,
    year = 2024,
    }

  • S. Pan, L. Jin, X. Huang, C. Stachniss, M. Popović, and M. Bennewitz, “Exploiting Priors from 3D Diffusion Models for RGB-Based One-Shot View Planning,” in Proc. of the ICRA Workshop on Neural Fields in Robotics (RoboNerF), 2024.
    [BibTeX]
    @inproceedings{pan2024icraws,
    title={{Exploiting Priors from 3D Diffusion Models for {RGB}-Based One-Shot View Planning}},
    author={S. Pan and L. Jin and X. Huang and C. Stachniss and M. Popovi\'c and M. Bennewitz},
    booktitle={Proc. of the ICRA Workshop on Neural Fields in Robotics (RoboNerF)},
    year={2024},
    }

2021

  • C. Shi, X. Chen, K. Huang, J. Xiao, H. Lu, and C. Stachniss, “Keypoint Matching for Point Cloud Registration using Multiplex Dynamic Graph Attention Networks,” IEEE Robotics and Automation Letters (RA-L), vol. 6, pp. 8221-8228, 2021. doi:10.1109/LRA.2021.3097275
    [BibTeX] [PDF]
    @article{shi2021ral,
    title={{Keypoint Matching for Point Cloud Registration using Multiplex Dynamic Graph Attention Networks}},
    author={C. Shi and X. Chen and K. Huang and J. Xiao and H. Lu and C. Stachniss},
    year={2021},
    journal=ral,
    volume=6,
    number=4,
    pages={8221--8228},
    doi = {10.1109/LRA.2021.3097275},
    issn = {2377-3766},
    }

2019

  • K. Huang, J. Xiao, and C. Stachniss, “Accurate Direct Visual-Laser Odometry with Explicit Occlusion Handling and Plane Detection,” in Proc. of the IEEE Intl. Conf. on Robotics & Automation (ICRA), 2019.
    [BibTeX] [PDF]
    @InProceedings{huang2019icra,
    author = {K. Huang and J. Xiao and C. Stachniss},
    title = {{Accurate Direct Visual-Laser Odometry with Explicit Occlusion Handling and Plane Detection}},
    booktitle = icra,
    year = 2019,
    }

2018

  • K. H. Huang and C. Stachniss, “Joint Ego-motion Estimation Using a Laser Scanner and a Monocular Camera Through Relative Orientation Estimation and 1-DoF ICP,” in Proc. of the IEEE/RSJ Intl. Conf. on Intelligent Robots and Systems (IROS), 2018.
    [BibTeX] [PDF] [Video]

    Pose estimation and mapping are key capabilities of most autonomous vehicles, and thus a number of localization and SLAM algorithms have been developed in the past. Autonomous robots and cars are typically equipped with multiple sensors. Often, the sensor suite includes a camera and a laser range finder. In this paper, we consider the problem of incremental ego-motion estimation using a monocular camera and a laser range finder jointly. We propose a new algorithm that exploits the advantages of both sensors: the ability of cameras to determine orientations well and the ability of laser range finders to estimate the scale and to directly obtain 3D point clouds. Our approach estimates the five-degree-of-freedom relative orientation from image pairs through feature point correspondences and formulates the remaining scale estimation as a new variant of the iterative closest point problem with only one degree of freedom. We furthermore exploit the camera information in a new way to constrain the data association between laser point clouds. The experiments presented in this paper suggest that our approach is able to accurately estimate the ego-motion of a vehicle and that we obtain more accurate frame-to-frame alignments than with one sensor modality alone.
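
    The two-stage idea in the abstract can be sketched in a few lines. The following is a minimal, hypothetical illustration (not the authors' implementation): it assumes the rotation R and the unit translation direction t_dir have already been recovered from the image pair, and it uses plain nearest-neighbor matching where the paper constrains the data association with camera information; all names are illustrative.

```python
import numpy as np

def one_dof_icp(curr_pts, prev_pts, R, t_dir, iters=10):
    """Align curr_pts (Nx3) to prev_pts (Mx3) given a known rotation R
    and a known *unit* translation direction t_dir, estimating only the
    scale s of the translation s * t_dir -- the 1-DoF ICP variant."""
    s = 0.0
    for _ in range(iters):
        # transform the current scan with the current scale estimate
        moved = curr_pts @ R.T + s * t_dir
        # data association: nearest neighbor in the previous scan
        d = np.linalg.norm(moved[:, None, :] - prev_pts[None, :, :], axis=2)
        matches = prev_pts[np.argmin(d, axis=1)]
        # closed-form 1-DoF least squares: s = mean( t_dir . (q - R p) )
        resid = matches - curr_pts @ R.T
        s = float(np.mean(resid @ t_dir))
    return s
```

    With well-separated points and correct associations, s converges to the true translation magnitude after a single iteration; the alternation between re-association and the closed-form scale update mirrors the usual ICP loop, only with one unknown instead of six.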

    @InProceedings{huang2018iros,
    author = {K.H. Huang and C. Stachniss},
    title = {{Joint Ego-motion Estimation Using a Laser Scanner and a Monocular Camera Through Relative Orientation Estimation and 1-DoF ICP}},
    booktitle = iros,
    year = 2018,
    videourl = {https://www.youtube.com/watch?v=Glv0UT_KqoM},
    abstract = {Pose estimation and mapping are key capabilities of most autonomous vehicles, and thus a number of localization and SLAM algorithms have been developed in the past. Autonomous robots and cars are typically equipped with multiple sensors. Often, the sensor suite includes a camera and a laser range finder. In this paper, we consider the problem of incremental ego-motion estimation using a monocular camera and a laser range finder jointly. We propose a new algorithm that exploits the advantages of both sensors: the ability of cameras to determine orientations well and the ability of laser range finders to estimate the scale and to directly obtain 3D point clouds. Our approach estimates the five-degree-of-freedom relative orientation from image pairs through feature point correspondences and formulates the remaining scale estimation as a new variant of the iterative closest point problem with only one degree of freedom. We furthermore exploit the camera information in a new way to constrain the data association between laser point clouds. The experiments presented in this paper suggest that our approach is able to accurately estimate the ego-motion of a vehicle and that we obtain more accurate frame-to-frame alignments than with one sensor modality alone.}
    }

  • K. H. Huang and C. Stachniss, “On Geometric Models and Their Accuracy for Extrinsic Sensor Calibration,” in Proc. of the IEEE Intl. Conf. on Robotics & Automation (ICRA), 2018.
    [BibTeX] [PDF]
    @InProceedings{huang2018icra,
    author = {K.H. Huang and C. Stachniss},
    title = {{On Geometric Models and Their Accuracy for Extrinsic Sensor Calibration}},
    booktitle = icra,
    year = 2018,
    url = {https://www.ipb.uni-bonn.de/pdfs/huang2018icra.pdf},
    }

2017

  • K. H. Huang and C. Stachniss, “Extrinsic Multi-Sensor Calibration for Mobile Robots Using the Gauss-Helmert Model,” in Proc. of the IEEE/RSJ Intl. Conf. on Intelligent Robots and Systems (IROS), 2017.
    [BibTeX] [PDF]
    @InProceedings{huang2017iros,
    author = {K.H. Huang and C. Stachniss},
    title = {{Extrinsic Multi-Sensor Calibration for Mobile Robots Using the Gauss-Helmert Model}},
    booktitle = iros,
    year = 2017,
    url = {https://www.ipb.uni-bonn.de/pdfs/huang2017iros.pdf},
    }