MLP hyperparameter tuning starter

Hi all,

I'm sharing my Kaggle notebook for building a simple neural network (a multi-layer perceptron) to fit the Numerai Tournament data.

[numerai] MLP with KerasTuner Starter

Since NN hyperparameters are hard for beginners to tune, the notebook also demonstrates how to use KerasTuner to automatically search for good hyperparameter values.
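To make the tuning idea concrete, here is a minimal sketch of what a tuner automates: try several hyperparameter settings, keep the one with the best validation score. This is not the notebook's actual code; scikit-learn's `MLPRegressor` on synthetic data stands in for the Keras model so the example stays small and self-contained, and the search space shown is a made-up example.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Tiny synthetic stand-in for the tournament data.
rng = np.random.default_rng(0)
X = rng.normal(size=(400, 10))
y = X[:, 0] * 0.5 + rng.normal(scale=0.1, size=400)
X_train, X_val = X[:300], X[300:]
y_train, y_val = y[:300], y[300:]

# Hypothetical search space: hidden-layer sizes and learning rate.
search_space = [
    {"hidden": (16,), "lr": 1e-2},
    {"hidden": (64,), "lr": 1e-2},
    {"hidden": (64, 32), "lr": 1e-3},
]

best_params, best_mse = None, float("inf")
for params in search_space:
    model = MLPRegressor(
        hidden_layer_sizes=params["hidden"],
        learning_rate_init=params["lr"],
        max_iter=300,
        random_state=0,
    )
    model.fit(X_train, y_train)
    # Score each candidate on held-out data and keep the best.
    mse = np.mean((model.predict(X_val) - y_val) ** 2)
    if mse < best_mse:
        best_params, best_mse = params, mse

print("best hyperparameters:", best_params)
```

KerasTuner does the same thing for a Keras model, but with smarter search strategies (random search, Hyperband, Bayesian optimization) and a declarative way to define the search space.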

Hopefully this notebook helps anyone who is bored with integration_test-like models and wants to try out NNs.


Thank you for your contribution. I upvoted the notebook. Hope it helps with my first submission in Numerai.

By the way, how did you do in the “Jane Street Market Prediction” competition? Could you compare and contrast the two competitions so I can more easily understand what Numerai is like?

Thank you very much

I will share my approach to the Jane Street competition on Kaggle, not here, and only if I end up winning. Otherwise there is no point in sharing it.

Numerai is simply easier to work on than Jane Street, as there is no time-series API complication. There are also quite a few resources for newcomers, so you might want to have a look:

https://docs.numer.ai/tournament/new-users

Thank you for your advice.