XGBoost vs LightGBM for alpha generation
Both XGBoost and LightGBM are popular choices for gradient-boosted alpha models. In my experiments:
- LightGBM: 2-3x faster training, similar accuracy, and native categorical-feature handling
- XGBoost: more robust to hyperparameter choices; slightly better on small datasets
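For anyone wanting a concrete starting point, here are illustrative baseline configurations for the two libraries. The parameter names are the real ones each library exposes; the values are generic defaults I'd start tuning from, not recommendations for any particular dataset.

```python
# Illustrative starting hyperparameters (values are assumptions,
# not tuned for any specific competition dataset).

lgbm_params = {
    "objective": "regression",
    "num_leaves": 31,          # leaf-wise growth: the main capacity knob
    "learning_rate": 0.05,
    "feature_fraction": 0.8,   # column subsampling per tree
    "min_data_in_leaf": 50,    # guards against overfitting tiny leaves
}
# LightGBM handles categoricals natively: pass categorical_feature=[...]
# when constructing lightgbm.Dataset.

xgb_params = {
    "objective": "reg:squarederror",
    "max_depth": 6,            # depth-wise growth: the main capacity knob
    "eta": 0.05,               # learning rate
    "subsample": 0.8,
    "colsample_bytree": 0.8,
}
# Recent XGBoost versions can consume pandas category dtypes via
# enable_categorical=True on the DMatrix.
```

LightGBM's `num_leaves` and XGBoost's `max_depth` are not interchangeable: a leaf-wise tree with 31 leaves can be much deeper than a depth-6 tree, which is one reason LightGBM is more sensitive to this setting.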
Thanks for sharing! This is exactly the kind of insight that helps the community grow. Bookmarking this thread.
Can confirm this approach works. I implemented something similar and jumped from rank 150 to rank 23 in two weeks.
Have you considered using PCA to reduce the dimensionality of the feature space? I found that the first 10 components capture 80%+ of the variance.
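A minimal way to check that "first 10 components capture 80%+ of the variance" claim on your own features is PCA via SVD. This sketch uses synthetic correlated data as a stand-in for the real feature matrix (the dimensions are assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic feature matrix with ~10 latent drivers behind 50 columns
# (a stand-in for the real feature space).
latent = rng.standard_normal((500, 10))
mixing = rng.standard_normal((10, 50))
X = latent @ mixing + 0.1 * rng.standard_normal((500, 50))

# PCA via SVD on the centered matrix.
Xc = X - X.mean(axis=0)
_, s, _ = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)      # variance ratio per component
cum10 = explained[:10].sum()          # cumulative share of first 10
print(f"first 10 components explain {cum10:.1%} of variance")
```

`sklearn.decomposition.PCA` gives the same numbers via `explained_variance_ratio_` if you prefer the fitted-estimator interface.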
For factor models, I'd strongly recommend the Fama-French 5-factor model as a starting point. It captures most of the systematic risk in equity returns.
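Fitting such a factor model is just an OLS regression of excess returns on the factor returns. A sketch with synthetic data standing in for the five Fama-French factors (MKT-RF, SMB, HML, RMW, CMA); the betas and noise level here are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
T = 1000
# Synthetic factor return series (stand-ins for the real FF5 factors).
factors = rng.standard_normal((T, 5)) * 0.02
true_betas = np.array([1.0, 0.3, -0.2, 0.1, 0.05])
alpha = 0.001
excess_ret = alpha + factors @ true_betas + 0.01 * rng.standard_normal(T)

# OLS: regress excess returns on an intercept plus the five factors.
X = np.column_stack([np.ones(T), factors])
coef, *_ = np.linalg.lstsq(X, excess_ret, rcond=None)
est_alpha, est_betas = coef[0], coef[1:]
```

The intercept `est_alpha` is the part of the return the five factors don't explain, i.e. the candidate alpha; everything loading on the factors is systematic exposure you're being paid beta, not alpha, for.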
I think the platform should add a paper trading mode so we can test strategies in a more realistic setting between competitions.
Turnover control is crucial. My best performing model has a turnover of only 8% daily. High turnover strategies rarely survive transaction costs.
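For anyone who wants to measure this on their own positions, one common convention is one-way turnover, half the total absolute weight traded per rebalance. A small sketch (the weights and the 10 bp cost assumption are illustrative):

```python
import numpy as np

def daily_turnover(w_prev, w_new):
    """One-way turnover: half the total absolute weight traded."""
    return 0.5 * np.abs(np.asarray(w_new) - np.asarray(w_prev)).sum()

w_yesterday = np.array([0.25, 0.25, 0.25, 0.25])
w_today     = np.array([0.30, 0.20, 0.25, 0.25])
t = daily_turnover(w_yesterday, w_today)   # 5% one-way turnover

# Rough annualized cost drag at 10 bp per unit traded (an assumption):
annual_drag = t * 0.001 * 252
```

At 5% daily turnover and 10 bp costs that's already ~1.3% a year of drag, which makes it clear why high-turnover signals need a lot of gross alpha to survive.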
For those new to the platform: start with the tutorial competition. It has a smaller dataset and more forgiving scoring.
Has anyone tried using attention mechanisms for this? The temporal attention weights could tell you which historical periods are most relevant.
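The interpretability angle here is that attention weights are an explicit, normalized relevance score over past periods. A minimal scaled dot-product sketch in NumPy (the shapes and the "query = most recent period" choice are assumptions for illustration, not a full model):

```python
import numpy as np

def temporal_attention(query, keys):
    """Scaled dot-product attention weights over historical periods.

    query: (d,) representation of the current period
    keys:  (T, d) representations of T historical periods
    Returns a length-T weight vector that sums to 1.
    """
    scores = keys @ query / np.sqrt(keys.shape[1])
    scores -= scores.max()            # for numerical stability
    w = np.exp(scores)
    return w / w.sum()

rng = np.random.default_rng(2)
keys = rng.standard_normal((12, 8))   # 12 historical periods, 8-dim features
query = keys[-1]                      # attend from the most recent period
weights = temporal_attention(query, keys)
# weights[t] ~ how relevant period t looks relative to the current one
```

In a trained model the `keys` would be learned period embeddings, and inspecting `weights` directly is what gives you the "which historical periods matter" readout.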
Interesting thread! I've been exploring reinforcement learning for portfolio allocation. The challenge is defining the right reward function.
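On the reward-function point: one candidate (an assumption on my part, not the poster's design) is a mean-variance utility, return minus a variance penalty, net of linear trading costs. Sketch:

```python
import numpy as np

def step_reward(portfolio_return, prev_w, new_w,
                cost_bp=10.0, risk_lambda=2.0):
    """One possible per-step reward for portfolio RL (illustrative):
    mean-variance utility r - lambda * r^2, net of linear costs.
    cost_bp and risk_lambda are assumed values, not calibrated."""
    turnover = np.abs(np.asarray(new_w) - np.asarray(prev_w)).sum()
    cost = turnover * cost_bp / 10_000.0
    net = portfolio_return - cost
    return net - risk_lambda * net**2

r = step_reward(0.01, [0.5, 0.5], [0.6, 0.4])
```

Baking costs into the reward directly is one way to get the turnover discipline discussed above without a separate constraint, since the agent is then penalized for churn at every step.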