Building a model that predicts users’ repeat purchase behavior after a promotional activity, based on the historical user behavior data generated during that activity, is an effective way for merchants to carry out precision marketing and improve ROI. Most existing prediction models rely on supervised learning and perform poorly when only a small amount of labeled data is available. This paper proposes a BERT-MLP prediction model that follows the paradigm of “unsupervised pre-training on large-scale data followed by fine-tuning on a small amount of labeled data.” Experimental results on a real Alibaba dataset show that the BERT-MLP model achieves higher accuracy than the baseline models.
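As a rough illustration of the kind of architecture the abstract describes, the following is a minimal PyTorch sketch of a BERT-style Transformer encoder over user behavior-event sequences with an MLP prediction head, fine-tuned on a small labeled batch. The layer sizes, vocabulary, class names, and toy data are illustrative assumptions, not the paper’s actual configuration; in practice the encoder weights would first be obtained by unsupervised pre-training on the large unlabeled dataset.

```python
import torch
import torch.nn as nn

class BertMLPClassifier(nn.Module):
    """Sketch: BERT-style encoder over behavior-event sequences + MLP head."""

    def __init__(self, vocab_size, hidden=256, heads=4, layers=4,
                 max_len=128, mlp_hidden=128, num_classes=2):
        super().__init__()
        self.token_emb = nn.Embedding(vocab_size, hidden)   # behavior-event embeddings
        self.pos_emb = nn.Embedding(max_len, hidden)         # learned position embeddings
        enc_layer = nn.TransformerEncoderLayer(
            d_model=hidden, nhead=heads, dim_feedforward=4 * hidden,
            batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=layers)
        self.mlp = nn.Sequential(                             # MLP prediction head
            nn.Linear(hidden, mlp_hidden), nn.ReLU(),
            nn.Linear(mlp_hidden, num_classes))

    def forward(self, ids, mask):
        pos = torch.arange(ids.size(1), device=ids.device).unsqueeze(0)
        x = self.token_emb(ids) + self.pos_emb(pos)
        # mask == 0 marks padding positions to be ignored by self-attention
        x = self.encoder(x, src_key_padding_mask=(mask == 0))
        cls = x[:, 0]                                         # first position as sequence summary
        return self.mlp(cls)

# Fine-tuning step on a small labeled batch (toy data; pre-trained
# encoder weights would normally be loaded before this step).
model = BertMLPClassifier(vocab_size=1000)
opt = torch.optim.AdamW(model.parameters(), lr=2e-5)
loss_fn = nn.CrossEntropyLoss()

ids = torch.randint(0, 1000, (8, 32))        # batch of behavior-event id sequences
mask = torch.ones(8, 32, dtype=torch.long)    # all positions valid in this toy batch
labels = torch.randint(0, 2, (8,))            # 1 = repeat buyer, 0 = not

opt.zero_grad()
loss = loss_fn(model(ids, mask), labels)
loss.backward()
opt.step()
```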
Dong Y, Jiang W, 2019, Brand Purchase Prediction Based on Time-Evolving User Behaviors in E-Commerce. Communications in Nonlinear Science and Numerical Simulation, 31(1): e4882.
Tsur O, Rappoport A, 2012, Proceedings of the Fifth ACM International Conference on Web Search and Data Mining, February 8-12, 2012: What’s in a Hashtag? Content Based Prediction of the Spread of Ideas in Microblogging Communities. Association for Computing Machinery, New York, NY, United States, 643-652.
Song HS, 2017, Comparison of Performance Between MLP and RNN Model to Predict Purchase Timing for Repurchase Product. Journal of Information Technology Applications and Management, 24(1): 111-128.
Liu G, Nguyen T, Zhao G, et al., 2016, Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, August 13-17, 2016: Repeat Buyer Prediction for E-Commerce. Association for Computing Machinery, New York, NY, United States, 155-164.
Hu W, Shi Y, 2020, Proceedings of the 2020 5th International Conference on Communication, Image and Signal Processing, November 13-15, 2020: Prediction of Online Consumers’ Buying Behavior Based on LSTM-RF Model. IEEE, 224-228.
Sun C, Qiu X, Xu Y, et al., 2019, Proceedings of the 18th China National Conference, October 18-20, 2019: How to Fine-Tune BERT for Text Classification? Springer-Verlag, Berlin, Heidelberg, 194-206.
Tolstikhin I, Houlsby N, Kolesnikov A, et al., 2021, MLP-Mixer: An All-MLP Architecture for Vision. arXiv, 2105.01601 (preprint).
Qiu X, Sun T, Xu Y, et al., 2020, Pre-Trained Models for Natural Language Processing: A Survey. Science China Technological Sciences, 63: 1872-1897.
Dai Z, Lai G, Yang Y, et al., 2020, Funnel-Transformer: Filtering Out Sequential Redundancy for Efficient Language Processing. arXiv, 2006.03236 (preprint).
Alibaba Cloud Tianchi, 2021, Repeat Buyers Prediction-Challenge the Baseline. Alibaba Cloud. https://tianchi.aliyun.com/competition/entrance/231576/information.