Poolnoodle-Mixup
Poolnoodle-Mixup is our new main contender for a big model with deep reasoning capabilities. It's a sparse mixture-of-experts model based on Mixtral 8x7, but with an updated tokenizer and embeddings.
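As a rough illustration of the sparse mixture-of-experts idea it inherits from its Mixtral base (a sketch only, not Poolnoodle-Mixup's actual implementation; the layer sizes and expert count are assumptions), a router scores each token, keeps the top-2 experts, and blends their outputs by the router weights:

```python
# Illustrative sketch of sparse mixture-of-experts routing (top-2, as in Mixtral).
# Not Poolnoodle-Mixup's real code; dimensions and expert count are placeholders.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoELayer(nn.Module):
    def __init__(self, dim: int, n_experts: int = 8, top_k: int = 2):
        super().__init__()
        self.router = nn.Linear(dim, n_experts, bias=False)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, 4 * dim), nn.SiLU(), nn.Linear(4 * dim, dim))
            for _ in range(n_experts)
        )
        self.top_k = top_k

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, dim). Each token is routed to its top_k experts only.
        logits = self.router(x)                                   # (tokens, n_experts)
        weights, picked = torch.topk(logits, self.top_k, dim=-1)  # (tokens, top_k)
        weights = F.softmax(weights, dim=-1)                      # normalize over the chosen experts
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = picked[:, slot] == e
                if mask.any():
                    out[mask] += weights[mask, slot].unsqueeze(-1) * expert(x[mask])
        return out

layer = SparseMoELayer(dim=64)
tokens = torch.randn(10, 64)
print(layer(tokens).shape)  # torch.Size([10, 64])
```

Only the chosen experts run for each token, which is what keeps inference cost low despite the large total parameter count.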
Poolnoodle-Mixup's strong points are general reasoning and deep general knowledge; this makes the model capable of making complex decisions.
Poolnoodle-Mixup is trained to be helpful and explanatory; given the appropriate instructions, it makes a great model for an AI chatbot.
With a standard context length of 16384 tokens, even without context-extension measures it can hold very long conversations and parse a considerable amount of text.
Despite being a relatively heavy model, it has been heavily optimized for speed, and as such it outputs much faster than most LLaMa-based models.
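A minimal sketch of chat-style use through Hugging Face Transformers (listed under runtime compatibility in the faceplate below); the repository id `scai/poolnoodle-mixup` and the system instruction are placeholder assumptions, not confirmed names:

```python
# Minimal sketch of chat-style usage via Hugging Face Transformers.
# "scai/poolnoodle-mixup" is a hypothetical repository id; substitute the real one.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "scai/poolnoodle-mixup"  # placeholder model id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# System instructions steer the model toward helpful, explanatory chat answers.
messages = [
    {"role": "system", "content": "You are a helpful, explanatory assistant."},
    {"role": "user", "content": "Explain what a sparse mixture-of-experts model is."},
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# The standard 16384-token context window leaves plenty of room for long chats.
outputs = model.generate(inputs, max_new_tokens=512)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

The system message carries the "appropriate instructions" mentioned above; swap it out to steer the assistant's behaviour.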
Model Faceplate
Model name | Poolnoodle-Mixup |
Family | Poolnoodle |
Model Base UUID | |
Parameters | Circa |
Origin | |
License | |
Context Length | 16384 tokens |
RoPE Scaling Supported | Yes, dynamic |
Tokenizer | scaitoken- |
Embeddings model | scailar- |
Runtime compatibility | HF Transformers, FastChat, vLLM, ScaiBlossom 1.3+ |
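The faceplate lists dynamic RoPE scaling and HF Transformers among the supported runtimes. Below is a minimal sketch of enabling dynamic scaling to stretch past the 16384-token default; the repository id is again a placeholder and the scaling factor is an arbitrary example:

```python
# Minimal sketch: loading the model with dynamic RoPE scaling enabled, which
# stretches the usable context beyond the 16384-token default.
# "scai/poolnoodle-mixup" is a placeholder id; factor=2.0 is an arbitrary example.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "scai/poolnoodle-mixup"  # placeholder model id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",
    rope_scaling={"type": "dynamic", "factor": 2.0},  # roughly 2x the base window
)

# Long documents can then be parsed in a single pass.
long_text = open("report.txt").read()
inputs = tokenizer(long_text, return_tensors="pt").to(model.device)
summary = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(summary[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```

Dynamic scaling only kicks in once the prompt grows past the original window, so behaviour within 16384 tokens is unchanged.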
Model Versions
Version | Version String | Release date | UUID |
0. | poolnoodle-mixup-0. | | |