Virtual influencers are fully computer-generated personas created by AI and 3D artists. These virtual influencers can model, sing, and even interact with fans without ever existing in real life. Despite their risks (e.g., lack of authenticity, ethical and transparency concerns), they represent enormous opportunities for brands.

Recently, ensemble-based machine learning models have been widely adopted and have demonstrated their effectiveness in bankruptcy prediction. However, these algorithms often function as black boxes, making it difficult to understand how they generate forecasts. This lack of transparency has led to growing interest in interpretability methods within artificial intelligence research.
In this paper, we assess the predictive performance of Random Forest, LightGBM, XGBoost, and NGBoost (Natural Gradient Boosting for probabilistic prediction) on French firms across various industries, with forecasting horizons of one to five years. We then apply SHapley Additive exPlanations (SHAP), a model-agnostic interpretability technique, to explain XGBoost, one of the best-performing models in our study. SHAP quantifies the contribution of each feature to the model’s predictions, enabling a clearer understanding of how financial and macroeconomic factors influence bankruptcy risk. Moreover, it explains individual predictions, making black-box models more applicable in credit risk management.
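SHAP attributions are grounded in Shapley values from cooperative game theory: each feature's contribution is its average marginal effect over all subsets of the other features. The sketch below is not the paper's model; it brute-forces exact Shapley values for a hypothetical linear "bankruptcy score" over three illustrative financial ratios (in practice one would use the `shap` package's `TreeExplainer` on the trained XGBoost model).

```python
from itertools import combinations
from math import factorial

# Hypothetical linear bankruptcy score over three illustrative ratios
# (weights are made up for the example, not taken from the paper).
WEIGHTS = {"leverage": 0.6, "liquidity": -0.4, "profitability": -0.3}

def model(x):
    return sum(WEIGHTS[f] * x[f] for f in WEIGHTS)

def shapley_values(x, baseline):
    """Exact Shapley attributions for `model` at point `x`.

    v(S) is the model evaluated with features in S at their observed
    values and all other features at the baseline; each feature's value
    is its subset-weighted average marginal contribution to v.
    """
    features = list(x)
    n = len(features)
    phi = {}
    for f in features:
        others = [g for g in features if g != f]
        total = 0.0
        for k in range(n):
            for subset in combinations(others, k):
                def v(with_f):
                    z = dict(baseline)          # absent features -> baseline
                    for g in subset:
                        z[g] = x[g]             # present features -> observed
                    if with_f:
                        z[f] = x[f]
                    return model(z)
                weight = factorial(k) * factorial(n - k - 1) / factorial(n)
                total += weight * (v(True) - v(False))
        phi[f] = total
    return phi

x = {"leverage": 0.8, "liquidity": 0.2, "profitability": -0.1}
baseline = {"leverage": 0.5, "liquidity": 0.5, "profitability": 0.1}
phi = shapley_values(x, baseline)

# Efficiency property: attributions sum to model(x) - model(baseline).
assert abs(sum(phi.values()) - (model(x) - model(baseline))) < 1e-9
```

For a linear model the Shapley value of each feature reduces to its weight times its deviation from the baseline (here, leverage contributes 0.6 × 0.3 = 0.18 to the risk score), which is why linear models are a useful sanity check before trusting attributions on tree ensembles.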
NGUYEN Hoang Hiep - EM Normandie
- Research
- Corporate and Market Finance