Li Bo
Other names: Brian Bo Li
PhD Student, NTU, Singapore
Cited by
Multi-source domain adaptation for semantic segmentation
S Zhao, B Li, X Yue, Y Gu, P Xu, R Hu, H Chai, K Keutzer
NeurIPS 2019, 2019
A review of single-source deep unsupervised visual domain adaptation
S Zhao, X Yue, S Zhang, B Li, H Zhao, B Wu, R Krishna, JE Gonzalez, ...
IEEE Transactions on Neural Networks and Learning Systems, 2020
Multi-source domain adaptation in the deep learning era: A systematic survey
S Zhao, B Li, P Xu, K Keutzer
arXiv preprint arXiv:2002.12169, 2020
Learning Invariant Representations and Risks for Semi-supervised Domain Adaptation
B Li, Y Wang, S Zhang, D Li, T Darrell, K Keutzer, H Zhao
CVPR 2021, 2020
Self-Supervised Pretraining Improves Self-Supervised Pretraining
CJ Reed, X Yue, A Nrusimha, S Ebrahimi, V Vijaykumar, R Mao, B Li, ...
WACV 2022, 2021
MADAN: multi-source adversarial domain aggregation network for domain adaptation
S Zhao, B Li, P Xu, X Yue, G Ding, K Keutzer
International Journal of Computer Vision 129 (8), 2399-2424, 2021
ePointDA: An End-to-End Simulation-to-Real Domain Adaptation Framework for LiDAR Point Cloud Segmentation
S Zhao, Y Wang, B Li, B Wu, Y Gao, P Xu, T Darrell, K Keutzer
AAAI 2021, 2020
Rethinking distributional matching based domain adaptation
B Li, Y Wang, T Che, S Zhang, S Zhao, P Xu, W Zhou, Y Bengio, ...
arXiv preprint arXiv:2006.13352, 2020
Invariant Information Bottleneck for Domain Generalization
B Li, Y Shen, Y Wang, W Zhu, CJ Reed, D Li, K Keutzer, H Zhao
AAAI 2022, 2021
Energy-Based Open-World Uncertainty Modeling for Confidence Calibration
Y Wang, B Li, T Che, K Zhou, D Li, Z Liu
ICCV 2021, 2021
Domain generalization using pretrained models without fine-tuning
Z Li, K Ren, X Jiang, B Li, H Zhang, D Li
arXiv preprint arXiv:2203.04600, 2022
OpenOOD: Benchmarking Generalized Out-of-Distribution Detection
J Yang, P Wang, D Zou, Z Zhou, K Ding, W Peng, H Wang, G Chen, B Li, ...
NeurIPS 2022 Datasets and Benchmarks Track, 2022
Full-Cycle Energy Consumption Benchmark for Low-Carbon Computer Vision
B Li, X Jiang, D Bai, Y Zhang, N Zheng, X Dong, L Liu, Y Yang, D Li
arXiv preprint arXiv:2108.13465, 2021
Sparse Mixture-of-Experts are Domain Generalizable Learners
B Li, J Yang, J Ren, Y Wang, Z Liu
arXiv preprint arXiv:2206.04046, 2022
AnimeRun: 2D Animation Visual Correspondence from Open Source 3D Movies
L Siyao, Y Li, B Li, C Dong, Z Liu, CC Loy
NeurIPS 2022 Datasets and Benchmarks Track, 2022
Sparse Mixture-of-Experts are Domain Generalizable Learners
B Li, Y Shen, J Yang, Y Wang, J Ren, T Che, J Zhang, Z Liu
NeurIPS 2022 Workshop on Distribution Shifts: Connecting Methods and …, 2022
Your Autoregressive Generative Model Can be Better If You Treat It as an Energy-Based One
Y Wang, T Che, B Li, K Song, H Pei, Y Bengio, D Li
Preprint, 2022
Articles 1–17