
Poolnoodle-BigNoodle

Poolnoodle-BigNoodle is our largest standard transformer model, with roughly 75 billion parameters. It is a general-purpose model, trained across many languages and a broad range of generic capabilities.

Poolnoodle-BigNoodle's strengths are general reasoning and deep general knowledge, which make the model capable of handling complex decisions.

Poolnoodle-BigNoodle is trained to be helpful and explanatory; with the appropriate instructions, it is a great model for an AI chatbot.
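This page does not specify the model's chat or instruction template, so as a purely hypothetical sketch, a system instruction plus alternating conversation turns might be assembled like this (the `[SYSTEM]`/`[USER]`/`[ASSISTANT]` markers are illustrative, not the model's actual format):

```python
# Hypothetical sketch: the actual chat/instruction template for
# Poolnoodle-BigNoodle is not documented here, so the role markers
# below are illustrative placeholders only.

def build_prompt(system: str, turns: list[tuple[str, str]]) -> str:
    """Assemble a system instruction and (user, assistant) turns
    into a single prompt string ending with an open assistant turn."""
    parts = [f"[SYSTEM]\n{system}"]
    for user, assistant in turns:
        parts.append(f"[USER]\n{user}")
        if assistant:
            parts.append(f"[ASSISTANT]\n{assistant}")
    return "\n".join(parts) + "\n[ASSISTANT]\n"

prompt = build_prompt(
    "You are a helpful, explanatory assistant.",
    [("What is RoPE scaling?", "")],
)
```

The trailing open `[ASSISTANT]` marker is where the model's completion would begin.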

With a standard context length of 8192 tokens before any context-extension measures, it can hold fairly long conversations and parse a considerable amount of text.
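Applications still have to budget that 8192-token window themselves. A minimal sketch of keeping a conversation within the window by dropping the oldest turns first, using a crude whitespace word count as a stand-in for the real scaitoken-1 tokenizer (which is not available here):

```python
# Sketch: fit a conversation into the 8192-token context window by
# dropping the oldest turns first. `count_tokens` is a crude stand-in
# (whitespace words); a real deployment would use the scaitoken-1
# tokenizer instead.

CONTEXT_LENGTH = 8192

def count_tokens(text: str) -> int:
    return len(text.split())  # placeholder, not the real tokenizer

def fit_to_context(turns: list[str], budget: int = CONTEXT_LENGTH) -> list[str]:
    """Return the longest suffix of `turns` whose total token count
    fits in `budget`, preserving the most recent turns."""
    kept: list[str] = []
    used = 0
    for turn in reversed(turns):  # walk from newest to oldest
        cost = count_tokens(turn)
        if used + cost > budget:
            break
        kept.append(turn)
        used += cost
    return list(reversed(kept))
```

In practice the budget would also reserve headroom for the model's own response tokens.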

With 75 billion parameters it is not the leanest model, and consequently not the fastest. Use Poolnoodle-BigNoodle when you need deep reasoning and deep insights.

Model Faceplate

Model name: Poolnoodle-BigNoodle
Family: Poolnoodle
Model Base UUID: 2b1a1c6a-88c4-4db5-b08b-30ec5e8e2098
Parameters: circa 75 billion
Origin: Llama 2
License: Llama 2 license and ScaiLabs Model License
Context length: 8192 tokens
RoPE scaling supported: yes, dynamic
Tokenizer: scaitoken-1
Embeddings model: scailar-1
Runtime compatibility: HF Transformers, FastChat, ScaiBlossom
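The faceplate says dynamic RoPE scaling is supported but does not state the exact scheme. As an illustrative sketch only, the widely used dynamic-NTK variant rescales the rotary base once the sequence exceeds the native context; the head dimension and scaling factor below are assumptions, not documented values:

```python
# Sketch of "dynamic" RoPE scaling in the style of the common
# dynamic-NTK approach. Whether Poolnoodle-BigNoodle uses exactly this
# formula is an assumption; the faceplate only states that dynamic
# RoPE scaling is supported.

ORIG_CONTEXT = 8192   # native context length from the faceplate
HEAD_DIM = 128        # assumed per-head dimension, not stated in the doc

def dynamic_rope_base(seq_len: int, base: float = 10000.0,
                      factor: float = 2.0) -> float:
    """Rescale the rotary base when the sequence exceeds the native
    context, stretching the RoPE frequencies smoothly."""
    if seq_len <= ORIG_CONTEXT:
        return base  # within native context: no scaling needed
    scale = (factor * seq_len / ORIG_CONTEXT) - (factor - 1)
    return base * scale ** (HEAD_DIM / (HEAD_DIM - 2))

def rope_inv_freq(seq_len: int) -> list[float]:
    """Inverse frequencies for each pair of head dimensions."""
    base = dynamic_rope_base(seq_len)
    return [base ** (-2 * i / HEAD_DIM) for i in range(HEAD_DIM // 2)]
```

Because the base only changes once `seq_len` passes 8192, behaviour within the native context window is identical to unscaled RoPE.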

Model Versions

Version | Version String | Release Date | UUID
0.91 | poolnoodle-0.91-bignoodle | 2023-09-12 | fa1c2c2f-65be-4937-a58d-9c874a74a993
0.92 | poolnoodle-0.92-bignoodle | 2023-10-13 | 3418ce45-0fc0-43ac-8625-e10e704f197d
0.93 | poolnoodle-0.93-bignoodle | 2023-11-23 | b279b437-77e6-4ccf-a130-059ebc12189b
0.94 | poolnoodle-0.94-bignoodle | 2023-12-04 | 0017db83-339f-4199-8d59-2f6fbd8e8240
0.95 | poolnoodle-0.95-bignoodle | 2024-02-03 | c2942dd5-7f25-4edd-a675-2e9d64fe66e0