
Poolnoodle-Mixup

[Image: poolnoodle_bignoodle_cropped.png]

Poolnoodle-Mixup is our new main contender for a large model with deep reasoning capabilities. It is a sparse mixture-of-experts model based on Mixtral 8x7B, but with an updated tokenizer and embeddings.

Poolnoodle-Mixup's strong points are general reasoning and deep general knowledge, which enable the model to make complex decisions.

Poolnoodle-Mixup is trained to be helpful and explanatory; with the appropriate instructions, it makes a great model for an AI chatbot.
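
As an illustration, a minimal chat call through HF Transformers might look like the sketch below. The repo id `scailabs/poolnoodle-mixup` and the system instruction are assumptions for illustration, not confirmed names; check the model's actual hub location and prompt conventions.

```python
# Minimal chat sketch. The repo id is hypothetical, and we assume the
# tokenizer ships a chat template (both are assumptions, not confirmed here).
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "scailabs/poolnoodle-mixup"  # hypothetical repo id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id, device_map="auto")

messages = [
    # Hypothetical system instruction; tune it to your use case.
    {"role": "system", "content": "You are a helpful, explanatory assistant."},
    {"role": "user", "content": "Explain mixture-of-experts routing in two sentences."},
]

# apply_chat_template renders the messages with the model's prompt format.
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```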

With a standard context length of 16384 tokens, before any context-extension measures, it is capable of holding very long conversations and can parse considerable amounts of text.
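
To stay within the 16384-token window, you can count tokens before sending a long document. A sketch, assuming the scaitoken-2 tokenizer loads through AutoTokenizer from the same hypothetical repo id as above:

```python
# Pre-flight context check. The repo id is hypothetical; we assume the
# scaitoken-2 tokenizer is loadable via AutoTokenizer.
from transformers import AutoTokenizer

CONTEXT_LENGTH = 16384  # standard window, without context-extension measures

tokenizer = AutoTokenizer.from_pretrained("scailabs/poolnoodle-mixup")

def fits_in_context(text: str, reserved_for_output: int = 512) -> bool:
    """Return True if `text` leaves `reserved_for_output` tokens for generation."""
    n_tokens = len(tokenizer.encode(text))
    return n_tokens + reserved_for_output <= CONTEXT_LENGTH

document = open("report.txt").read()
print(fits_in_context(document))
```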

Despite being a relatively heavy model, it has been heavily optimized for speed and, as such, generates output much faster than most LLaMA-based models.

Model Faceplate

| Field | Value |
|---|---|
| Model name | Poolnoodle-BigNoodle |
| Family | Poolnoodle |
| Model base UUID | e7a79d39-f149-476f-9492-2884b3828722 |
| Parameters | Circa 56 billion |
| Origin | Mixtral 8x7B |
| License | Mixtral 8x7B license and ScaiLabs Model License |
| Context length | 16384 tokens |
| RoPE scaling supported | Yes (dynamic) |
| Tokenizer | scaitoken-2 |
| Embeddings model | scailar-2 |
| Runtime compatibility | HF Transformers, FastChat, vLLM, ScaiBlossom 1.3+ |
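
Since dynamic RoPE scaling is supported, extending the window beyond 16384 tokens might look like the sketch below. This assumes the model's config exposes HF Transformers' standard `rope_scaling` field, as Llama-family models do; a Mixtral-derived config may handle this differently, so treat it as a sketch rather than a recipe.

```python
# Sketch of extending the context window with dynamic RoPE scaling.
# Assumes the config accepts the standard `rope_scaling` override
# (an assumption; Mixtral-derived configs may not expose it).
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "scailabs/poolnoodle-mixup",                       # hypothetical repo id
    rope_scaling={"type": "dynamic", "factor": 2.0},   # ~2x the 16384 base window
    device_map="auto",
)
```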

Model Versions

| Version | Version String | Release Date | UUID |
|---|---|---|---|
| 0.1 | poolnoodle-mixup-0.1-alpha | 2024-07-12 | 94df8aec-d177-4f1c-abff-01b29acc1e81 |
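
If releases are published as git tags on a hub repo (an assumption, since the hosting arrangement isn't stated here), the 0.1 alpha build can be pinned by its version string:

```python
# Pinning the 0.1 alpha release, assuming version strings are published
# as git tags/branches on the hypothetical hub repo.
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "scailabs/poolnoodle-mixup",            # hypothetical repo id
    revision="poolnoodle-mixup-0.1-alpha",  # release tag from the table above
)
```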