Poolnoodle-BigNoodle

Poolnoodle-BigNoodle is our largest standard transformer model, with about 75 billion parameters. It is a general-purpose model, trained on many languages and with broad general capabilities.

Poolnoodle-BigNoodle's strong points are general reasoning and deep general knowledge, which make the model capable of complex decision-making.

Poolnoodle-BigNoodle is trained to be helpful and explanatory; with the appropriate instructions, it makes a great model for an AI chatbot.

With a standard context length of 8192 tokens (without context-extension measures), it can hold fairly long conversations and parse a considerable amount of text.
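As a rough illustration of what fits in that window, a long input can be split into chunks that each stay under the 8192-token budget. The sketch below uses whitespace tokens as a stand-in for the real scaitoken-1 tokenizer, whose API is not documented here:

```python
# Sketch: split text into chunks that fit a fixed token budget.
# Whitespace splitting stands in for the scaitoken-1 tokenizer (assumption).
CONTEXT_LENGTH = 8192

def chunk_text(text: str, budget: int = CONTEXT_LENGTH) -> list[str]:
    """Greedily pack whitespace tokens into chunks of at most `budget` tokens."""
    tokens = text.split()
    chunks = []
    for start in range(0, len(tokens), budget):
        chunks.append(" ".join(tokens[start:start + budget]))
    return chunks

long_text = " ".join(f"word{i}" for i in range(20000))
chunks = chunk_text(long_text)
print(len(chunks))  # 20000 words -> 3 chunks of at most 8192 tokens each
```

In practice you would count tokens with the actual tokenizer rather than whitespace, and reserve part of the budget for the prompt and the model's response.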

With 75 billion parameters, it is not the leanest model, and as such not the fastest. Use Poolnoodle-BigNoodle when you need deep reasoning and deep insights.

Model Faceplate

Model name Poolnoodle-BigNoodle
Family Poolnoodle
Model Base UUID 2b1a1c6a-88c4-4db5-b08b-30ec5e8e2098
Parameters Circa 75 billion
Origin Llama 2
License Llama 2 license and ScaiLabs Model License
Context Length 8192
RoPE Scaling Supported Yes, dynamic
Tokenizer scaitoken-1
Embeddings model scailar-1
Runtime compatibility HF Transformers, FastChat, ScaiBlossom
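The faceplate lists dynamic RoPE scaling, which in common implementations (for example the dynamic NTK variant used for Llama-family models) enlarges the rotary base once a sequence grows past the native 8192-token window. A minimal sketch of that base adjustment, assuming a head dimension of 128 and a rotary base of 10000, both typical for Llama-family models but not stated on this page:

```python
# Dynamic NTK-style RoPE base adjustment (sketch; head_dim=128 and
# base=10000.0 are assumptions typical of Llama-family models).
def dynamic_rope_base(seq_len: int,
                      orig_ctx: int = 8192,
                      base: float = 10000.0,
                      head_dim: int = 128,
                      factor: float = 2.0) -> float:
    """Return the adjusted rotary base for sequences longer than orig_ctx."""
    if seq_len <= orig_ctx:
        return base  # within the native window: no scaling needed
    scale = (factor * seq_len / orig_ctx) - (factor - 1)
    return base * scale ** (head_dim / (head_dim - 2))

print(dynamic_rope_base(4096))   # within the window -> 10000.0
print(dynamic_rope_base(16384))  # past the window -> base grows
```

Because the adjustment only kicks in beyond the native context, short prompts behave exactly as without scaling, which is why dynamic scaling is a low-risk default.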

Model Versions

Version Version String Release date UUID
0.91 poolnoodle-0.91-bignoodle 2023-09-12 fa1c2c2f-65be-4937-a58d-9c874a74a993
0.92 poolnoodle-0.92-bignoodle 2023-10-13 3418ce45-0fc0-43ac-8625-e10e704f197d
0.93 poolnoodle-0.93-bignoodle 2023-11-23 b279b437-77e6-4ccf-a130-059ebc12189b
0.94 poolnoodle-0.94-bignoodle 2023-12-04 0017db83-339f-4199-8d59-2f6fbd8e8240
0.95 poolnoodle-0.95-bignoodle 2024-02-03 c2942dd5-7f25-4edd-a675-2e9d64fe66e0
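When pinning a deployment, it can help to resolve a version string to its UUID programmatically. The lookup structure below is a hypothetical sketch; the data is taken from the version table above:

```python
# Version string -> UUID mapping, taken from the version table above.
VERSIONS = {
    "poolnoodle-0.91-bignoodle": "fa1c2c2f-65be-4937-a58d-9c874a74a993",
    "poolnoodle-0.92-bignoodle": "3418ce45-0fc0-43ac-8625-e10e704f197d",
    "poolnoodle-0.93-bignoodle": "b279b437-77e6-4ccf-a130-059ebc12189b",
    "poolnoodle-0.94-bignoodle": "0017db83-339f-4199-8d59-2f6fbd8e8240",
    "poolnoodle-0.95-bignoodle": "c2942dd5-7f25-4edd-a675-2e9d64fe66e0",
}

def latest_version() -> str:
    """Pick the highest release by its numeric version component (e.g. 0.95)."""
    return max(VERSIONS, key=lambda v: float(v.split("-")[1]))

print(latest_version())            # poolnoodle-0.95-bignoodle
print(VERSIONS[latest_version()])  # c2942dd5-7f25-4edd-a675-2e9d64fe66e0
```

Pinning by UUID rather than version string guards against a version string being reused or re-released.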