GPT-4's details are leaked. (threadreaderapp.com)

cross-posted from: https://lemmy.intai.tech/post/72919

Parameters count:

GPT-4 is more than 10x the size of GPT-3. We believe it has a total of ~1.8 trillion parameters across 120 layers. Mixture Of Experts - Confirmed.

OpenAI was able to keep costs reasonable by utilizing a mixture-of-experts (MoE) model. They use 16 experts within the model, each with ~111B parameters for the MLP. 2 of these experts are routed to per forward pass.
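To make the routing scheme described above concrete, here is a toy sketch of top-2 MoE routing: a gating network scores 16 experts, only the 2 highest-scoring ones run, and their outputs are combined with softmax weights. All dimensions and weights here are made-up toy values for illustration, not anything from the leak.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sizes -- illustrative only, not GPT-4's real dimensions.
D_MODEL = 8        # hidden size of a token vector
N_EXPERTS = 16     # 16 experts, as described in the post
TOP_K = 2          # 2 experts routed to per forward pass

# Each "expert" is an MLP; reduced to a single weight matrix for brevity.
experts = [rng.standard_normal((D_MODEL, D_MODEL)) for _ in range(N_EXPERTS)]
router = rng.standard_normal((D_MODEL, N_EXPERTS))  # gating network

def moe_forward(x):
    """Route one token vector x through its top-2 experts."""
    logits = x @ router                    # one score per expert
    top = np.argsort(logits)[-TOP_K:]      # indices of the 2 best experts
    weights = np.exp(logits[top])
    weights /= weights.sum()               # softmax over the chosen 2 only
    # Only the selected experts execute, so per-token compute scales
    # with TOP_K, not N_EXPERTS -- this is how MoE keeps costs down
    # despite the huge total parameter count.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top))

y = moe_forward(rng.standard_normal(D_MODEL))
print(y.shape)
```

The point of the sketch is the cost argument: the model stores all 16 experts' parameters, but each token only pays for 2 expert MLPs per layer, so active compute is a small fraction of the total ~1.8T parameters.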

Related Article: https://lemmy.intai.tech/post/72922

[-] shotgun_crab@lemmy.world 1 points 1 year ago* (last edited 1 year ago)

Oh I see. Thanks, it's interesting stuff

this post was submitted on 11 Jul 2023
145 points (97.4% liked)

ChatGPT

8909 readers

Unofficial ChatGPT community to discuss anything ChatGPT

founded 1 year ago