Furu Wei
Partner Research Manager, Microsoft Research
Verified email at microsoft.com
BEiT: BERT pre-training of image transformers
H Bao, L Dong, S Piao, F Wei
arXiv preprint arXiv:2106.08254, 2021
Cited by: 2946
Oscar: Object-Semantics Aligned Pre-training for Vision-Language Tasks
X Li, X Yin, C Li, P Zhang, X Hu, L Zhang, L Wang, H Hu, L Dong, F Wei, ...
Computer Vision–ECCV 2020: 16th European Conference, Glasgow, UK, August 23 …, 2020
Cited by: 2152
Swin transformer v2: Scaling up capacity and resolution
Z Liu, H Hu, Y Lin, Z Yao, Z Xie, Y Wei, J Ning, Y Cao, Z Zhang, L Dong, ...
Proceedings of the IEEE/CVF conference on computer vision and pattern …, 2022
Cited by: 1944
VL-BERT: Pre-training of generic visual-linguistic representations
W Su, X Zhu, Y Cao, B Li, L Lu, F Wei, J Dai
arXiv preprint arXiv:1908.08530, 2019
Cited by: 1908
Unified language model pre-training for natural language understanding and generation
L Dong, N Yang, W Wang, F Wei, X Liu, Y Wang, J Gao, M Zhou, HW Hon
33rd Conference on Neural Information Processing Systems (NeurIPS 2019), 2019
Cited by: 1833
WavLM: Large-scale self-supervised pre-training for full stack speech processing
S Chen, C Wang, Z Chen, Y Wu, S Liu, Z Chen, J Li, N Kanda, T Yoshioka, ...
IEEE Journal of Selected Topics in Signal Processing 16 (6), 1505-1518, 2022
Cited by: 1675
Learning sentiment-specific word embedding for Twitter sentiment classification
D Tang, F Wei, N Yang, M Zhou, T Liu, B Qin
Proceedings of the 52nd Annual Meeting of the Association for Computational …, 2014
Cited by: 1630
Adaptive recursive neural network for target-dependent Twitter sentiment classification
L Dong, F Wei, C Tan, D Tang, M Zhou, K Xu
ACL, 2014
Cited by: 1240
MiniLM: Deep self-attention distillation for task-agnostic compression of pre-trained transformers
W Wang, F Wei, L Dong, H Bao, N Yang, M Zhou
Advances in Neural Information Processing Systems 33, 5776-5788, 2020
Cited by: 1179
Gated self-matching networks for reading comprehension and question answering
W Wang, N Yang, F Wei, B Chang, M Zhou
Proceedings of the 55th Annual Meeting of the Association for Computational …, 2017
Cited by: 858
HIBERT: Document level pre-training of hierarchical bidirectional transformers for document summarization
X Zhang, F Wei, M Zhou
ACL, 2019
Cited by: 838*
LayoutLM: Pre-training of text and layout for document image understanding
Y Xu, M Li, L Cui, S Huang, F Wei, M Zhou
Proceedings of the 26th ACM SIGKDD international conference on knowledge …, 2020
Cited by: 810
Image as a foreign language: BEiT pretraining for all vision and vision-language tasks
W Wang, H Bao, L Dong, J Bjorck, Z Peng, Q Liu, K Aggarwal, ...
arXiv preprint arXiv:2208.10442, 2022
Cited by: 753*
Recognizing named entities in tweets
X Liu, S Zhang, F Wei, M Zhou
Proceedings of the 49th annual meeting of the association for computational …, 2011
Cited by: 648
Topic sentiment analysis in Twitter: a graph-based hashtag sentiment classification approach
X Wang, F Wei, X Liu, M Zhou, M Zhang
Proceedings of the 20th ACM international conference on Information and …, 2011
Cited by: 636
Question answering over Freebase with multi-column convolutional neural networks
L Dong, F Wei, M Zhou, K Xu
Proceedings of the 53rd Annual Meeting of the Association for Computational …, 2015
Cited by: 590
Kosmos-2: Grounding multimodal large language models to the world
Z Peng, W Wang, L Dong, Y Hao, S Huang, S Ma, F Wei
arXiv preprint arXiv:2306.14824, 2023
Cited by: 582
Neural codec language models are zero-shot text to speech synthesizers
C Wang, S Chen, Y Wu, Z Zhang, L Zhou, S Liu, Z Chen, Y Liu, H Wang, ...
arXiv preprint arXiv:2301.02111, 2023
Cited by: 561
LayoutLMv2: Multi-modal pre-training for visually-rich document understanding
Y Xu, Y Xu, T Lv, L Cui, F Wei, G Wang, Y Lu, D Florencio, C Zhang, ...
arXiv preprint arXiv:2012.14740, 2020
Cited by: 529
SuperAgent: A customer service chatbot for e-commerce websites
L Cui, S Huang, F Wei, C Tan, C Duan, M Zhou
Proceedings of ACL 2017, system demonstrations, 97-102, 2017
Cited by: 495
Articles 1–20