Bug report: Submission results showing NaN for some metrics
After my latest submission, some performance metrics show NaN in the results dashboard. Specifically:
- Max Drawdown: NaN
- Calmar Ratio: NaN
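For context on why these two metrics tend to go NaN together: Calmar is conventionally computed as annualized return divided by |max drawdown|, so a zero drawdown (or any NaN in the return series) propagates into both. A minimal sketch of that failure mode — the function names and the exact formula are my assumptions, not necessarily the competition's scoring code:

```python
import numpy as np

def max_drawdown(returns: np.ndarray) -> float:
    """Largest peak-to-trough decline of the cumulative equity curve."""
    equity = np.cumprod(1.0 + returns)
    peaks = np.maximum.accumulate(equity)
    drawdowns = equity / peaks - 1.0
    return drawdowns.min()  # <= 0; exactly 0 if equity never falls below a peak

def calmar(returns: np.ndarray, periods_per_year: int = 252) -> float:
    """Annualized mean return divided by |max drawdown|."""
    ann_return = np.mean(returns) * periods_per_year
    mdd = max_drawdown(returns)
    return ann_return / abs(mdd)  # x/0 -> inf, 0/0 or NaN input -> NaN

# A strictly non-negative return series never draws down, so mdd == 0
flat = np.array([0.0, 0.01, 0.0, 0.02])
print(max_drawdown(flat))  # 0.0 -> Calmar's denominator vanishes
```

So it's worth checking whether the affected submissions had either a NaN somewhere in the evaluated returns or an equity curve that never dipped below its running peak.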
Steps to reproduce:
7 Replies
Good point about overfitting. My rule of thumb: never trust a backtest with fewer than 500 observations in the out-of-sample period.
Anyone else noticing that momentum factors have been working particularly well in the last month of competition data?
Turnover control is crucial. My best-performing model has a turnover of only 8% daily. High-turnover strategies rarely survive transaction costs.
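For anyone wanting to measure this on their own portfolio weights: one common convention defines daily turnover as half the sum of absolute weight changes, so 0.0 means no trading and 1.0 means the entire book was replaced. A minimal sketch — the halving convention is an assumption on my part; some venues count one-way trades without it:

```python
import numpy as np

def daily_turnover(w_prev: np.ndarray, w_new: np.ndarray) -> float:
    """Fraction of the portfolio traded in one rebalance:
    0.5 * sum(|w_new - w_prev|)."""
    return 0.5 * np.sum(np.abs(w_new - w_prev))

# Moving 10% of the book out of one name and into another:
w_prev = np.array([0.50, 0.30, 0.20])
w_new  = np.array([0.40, 0.40, 0.20])
print(daily_turnover(w_prev, w_new))  # ~0.1, i.e. 10% of the book turned over
```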
Thanks for sharing! This is exactly the kind of insight that helps the community grow. Bookmarking this thread.
The biggest mistake I see newcomers make is optimizing for the wrong metric: a high Sharpe ratio != the best trading strategy. Consider Calmar, Sortino, and max drawdown as well.
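To make the Sharpe-vs-Sortino distinction concrete: the two differ only in the denominator — total volatility vs downside deviation — so a strategy whose variance comes mostly from losses scores worse on Sortino even when its Sharpe looks fine. A minimal sketch of both, assuming the usual √252 annualization and a zero downside target (conventions, not the competition's definitions):

```python
import numpy as np

def sharpe(returns: np.ndarray, periods_per_year: int = 252) -> float:
    """Mean return over total volatility, annualized."""
    return np.mean(returns) / np.std(returns, ddof=1) * np.sqrt(periods_per_year)

def sortino(returns: np.ndarray, periods_per_year: int = 252) -> float:
    """Mean return over downside deviation (losses only), annualized."""
    downside = np.minimum(returns, 0.0)          # keep only the losing periods
    downside_dev = np.sqrt(np.mean(downside**2))  # RMS of losses, zero target
    return np.mean(returns) / downside_dev * np.sqrt(periods_per_year)

r = np.array([0.01, -0.005, 0.02, -0.01, 0.015])
print(sharpe(r), sortino(r))  # same numerator, different risk denominators
```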
One more thing: the scoring engine uses a held-out test period that you never see, so your validation score is only the best available proxy for your final result, not a guarantee.
The competition scoring docs could definitely be clearer. I spent 2 hours debugging what turned out to be a normalization issue.