What is your machine spec (CPU and GPU) for Numerai?

I am just curious about it.

GPU: GTX 1080
CPU: Ryzen 3700, 8 cores
RAM: 128 GB

I’m probably going to get an RTX 4090 when those come out.

1 Like

Thank you very much.
Similar spec to mine (5900HX, RTX 2080, 64 GB RAM).
I am also considering upgrading it.

11th-generation Intel Core i9-11900K @ 3.50 GHz, 16 logical processors
64 GB memory
NVidia GeForce RTX 3070

I don’t use the RTX 3070 much for Numerai, as most of my processing is non-linear.

1 Like

Thank you very much.
I also mainly use the CPU now.

1 Like

CPU: Intel i7-9800X @ 3.80 GHz, 16 threads
GPU: RTX 3090
RAM: 128 GB

1 Like

CPU: Intel i9-7900X (10 cores, 3.3 GHz)
GPU: NVIDIA GeForce RTX 3090
RAM: 64 GB
Built in early 2020; GPU upgrade in 2022; RAM upgrade in 2021 (32 GB -> 64 GB).

Training and prediction for neural networks run on the GPU; XGBoost trains on the GPU but predicts on the CPU, because prediction’s memory usage is too high even for the 3090 (sketched below).
I was hesitant about the 3090 at first because of the price, but now I’m very happy with it, mainly thanks to fast iteration on modeling ideas. I came from a GTX 1060 6 GB.
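
As a minimal sketch of that split, assuming the XGBoost 1.x scikit-learn wrapper; the data shapes and hyperparameters here are placeholders, not my actual model:

```python
import numpy as np
import xgboost as xgb

# Placeholder stand-ins for the Numerai feature matrix and targets
X_train = np.random.rand(10_000, 100).astype(np.float32)
y_train = np.random.rand(10_000).astype(np.float32)
X_live = np.random.rand(5_000, 100).astype(np.float32)

# Train on the GPU (XGBoost 1.x parameter names; 2.x uses device="cuda" instead)
model = xgb.XGBRegressor(
    n_estimators=2000,
    learning_rate=0.01,
    max_depth=5,
    tree_method="gpu_hist",
)
model.fit(X_train, y_train)

# Switch prediction to the CPU so the prediction matrix never has to fit in VRAM
model.set_params(predictor="cpu_predictor")
preds = model.predict(X_live)
```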

Besides Numerai, I use the workstation to try out other compute-heavy things such as simulations, or deep-learning image generation, e.g. with the Stable Diffusion model.

1 Like

i7-6800K, GTX 1070, 128 GB of RAM, recently upgraded from 64 GB earlier this year!

1 Like

Currently 64 GB is enough for me. Did you increase the RAM because 64 GB was not enough for you?

Yeah, I probably could’ve been smarter about some things, but I ran out of RAM when modeling with XGBoost, and I’m too lazy to get it working with less memory.

2 Likes

I currently use Google Colab Pro+, which has some sort of Intel Xeon, 50 GB RAM, and an NVIDIA Tesla V100. Occasionally I get lucky and score an A100 instance.

Yes, XGBoost is very greedy with memory, especially on Windows. It can easily run out of memory even with 64 GB on the full v4 dataset.

For some reason, XGBoost behaves better under Linux, with fewer “out of memory” scenarios. I am not sure why.

In my experience, LightGBM can handle twice as much data as XGBoost using the same amount of RAM.
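
For illustration, a minimal LightGBM setup along those lines, assuming hypothetical file and column names for the Numerai training data (the max_bin value is just an assumption based on the features taking only a few discrete values):

```python
import lightgbm as lgb
import pandas as pd

# Hypothetical file/column names for the Numerai training data
df = pd.read_parquet("train.parquet")
features = [c for c in df.columns if c.startswith("feature")]

# LightGBM bins features into compact histograms internally, which is
# typically where its memory advantage over XGBoost shows up on large data
train_set = lgb.Dataset(df[features], label=df["target"])

params = {
    "objective": "regression",
    "learning_rate": 0.01,
    "num_leaves": 31,
    "max_bin": 7,  # assumption: the features only take a handful of discrete values
}
booster = lgb.train(params, train_set, num_boost_round=2000)
```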

EDIT: Google is now enforcing a credit system with Colab, so I can’t use their high-end GPUs nonstop anymore :frowning: I will probably shift model training to my desktop, which is an Intel 8700K, 64 GB RAM, NVIDIA 3090.

1 Like

Yeah, but you can’t say XGBoooOOOOOOSTTT with LightGBM :confused:

1 Like

The dataset will probably get bigger in the future, and even 128 GB won’t be enough. Hopefully the price of NMR goes up 10x before that, so I can buy a machine with even more memory.

4 Likes

@dzheng1887
By the way, why do you use XGBoost? Do you not use LightGBM?

CPU: AMD TR 1920x
GPU: RTX 3090 x2
RAM: 128GB

I am running my models for free :crazy_face: in Kaggle notebooks on the CPU runtime, mostly XGBoost or LightGBM, without a GPU or TPU.
But @nyuton here explains how you can train your model on the full v4 dataset with only 8 GB of RAM:
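
(I don’t know the exact approach in that post, but one common way to keep memory low is to read the int8 variant of the training file and only a subset of feature columns. The file path and features.json keys below are assumptions about the v4 layout, not something from that post.)

```python
import json
import pandas as pd

# Assumption: the int8 parquet variant and a "small" feature set in features.json,
# as distributed with the v4 data at the time of writing
with open("v4/features.json") as f:
    feature_cols = json.load(f)["feature_sets"]["small"]

df = pd.read_parquet(
    "v4/train_int8.parquet",
    columns=["era", "target"] + feature_cols,
)
print(df.memory_usage(deep=True).sum() / 1e9, "GB")  # sanity-check the footprint
```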

3 Likes

3060 Ti (8 GB) + Intel i9-10980XE + 64 GB RAM. I lean heavily on Intel MKL for a lot of the calculations and only send the NN training off to the GPU. A future GPU with more memory might let me stay on the GPU more, but I wasn’t going to pay more than the asking price for a 3060 Ti back when I bought one :-).
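
If you want to check whether NumPy is actually linked against MKL (rather than, say, OpenBLAS), a quick way is:

```python
import numpy as np

# Prints the BLAS/LAPACK backend NumPy was built against;
# look for "mkl" in the library names
np.show_config()
```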

1 Like

No reason in particular; it wins a lot of Kaggle competitions, and it’s sort of a running gag for me now to act silly and push data through XGBoost to solve all my problems.

That said, I recently learned that you can generally think of boosting algorithms as a non-parametric approach to estimation, which is a good reason why they do well in general. Combining a non-parametric approach with true structural assumptions about how the data behaves is always better, though it requires more work and more thinking about the actual underlying process than just getting predictions out of XGBoost.

1 Like

Primary machine:
CPU: 12-core Intel i9-10920X @ 3.5 GHz
RAM: 256 GB
GPU: 2x RTX 3090, 24 GB of GPU memory on each card

Older secondary machine:
CPU: Quad-core Intel i7-7700K @ 4.2 GHz
RAM: 64 GB
GPU: 2x 2080 Ti, 11 GB of GPU memory on each card

Both machines are headless.

3 Likes

Hi JRB,

I have 2x 3090 as well, blower-style 2-slot, so there is room for 2 more if my PSU can handle it (at least one would require a riser cable).

Do you reckon linking them with NVLink would help at all on the Numerai data? Or not really? I’m mostly looking for an easy way to pool the RAM together for a total of 48 GB.
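
(For completeness, one fallback I’m aware of: a plain model-parallel split in PyTorch uses both cards’ memory without any pooling. A minimal sketch, with purely hypothetical layer sizes:)

```python
import torch
import torch.nn as nn

# Hypothetical two-stage network split across the two 3090s; only the activations
# (and their gradients) cross between the cards during forward/backward
class SplitNet(nn.Module):
    def __init__(self, n_features: int):
        super().__init__()
        self.front = nn.Sequential(nn.Linear(n_features, 2048), nn.ReLU()).to("cuda:0")
        self.back = nn.Linear(2048, 1).to("cuda:1")

    def forward(self, x):
        x = self.front(x.to("cuda:0"))
        return self.back(x.to("cuda:1"))

model = SplitNet(n_features=1050)
out = model(torch.randn(64, 1050))  # output tensor lives on cuda:1
```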