Poolnoodle-LongNoodle

[Image: poolnoodle_longnoodle_cropped.png]

Poolnoodle-LongNoodle is a sibling model of Poolnoodle-BigNoodle with far fewer parameters, trained on largely the same datasets. The key difference between LongNoodle and BigNoodle is that LongNoodle supports a context length of more than 128K tokens.

Thanks to this large context window, Poolnoodle-LongNoodle's primary strength is reading, processing, and summarizing large bodies of text and data.
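Even a 128K window can be exceeded by very large documents, so a common pattern is to split the input into chunks that fit, summarize each chunk, then summarize the summaries. Below is a minimal sketch of the chunking step; the characters-per-token heuristic is an assumption, since the actual scaitoken-1 tokenizer may segment text differently:

```python
def chunk_text(text: str, max_tokens: int = 131072, chars_per_token: int = 4) -> list[str]:
    """Split text into chunks that should fit the model's context window.

    Uses a rough chars-per-token heuristic (an assumption; the real
    scaitoken-1 tokenizer may produce a different token count).
    """
    max_chars = max_tokens * chars_per_token
    return [text[i:i + max_chars] for i in range(0, len(text), max_chars)]
```

Each chunk can then be sent to the model independently; a margin below the full 131072-token limit should be reserved for the prompt and the generated summary.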

Poolnoodle-LongNoodle is capable of extracting information from formatted text such as HTML, Markdown, and other structured markup, as long as the source material is plain text.
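In practice it often helps to strip markup down to visible text before sending it to the model, both to save context tokens and to remove noise. A small pre-processing sketch using Python's standard-library `html.parser` (this is an illustrative helper, not part of the Poolnoodle tooling):

```python
from html.parser import HTMLParser


class TextExtractor(HTMLParser):
    """Collect visible text from HTML, skipping script/style contents."""

    def __init__(self):
        super().__init__()
        self._skip_depth = 0   # inside <script> or <style>
        self.parts = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if not self._skip_depth and data.strip():
            self.parts.append(data.strip())


def html_to_text(html: str) -> str:
    """Return the visible text of an HTML fragment as one line."""
    parser = TextExtractor()
    parser.feed(html)
    return " ".join(parser.parts)
```

For example, `html_to_text("<p>Hello <b>world</b></p>")` yields `"Hello world"`, which can then be chunked and passed to the model.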

The context length of Poolnoodle-LongNoodle can be extended even further via RoPE scaling, although this comes at a significant memory cost, since the attention key/value cache grows with sequence length.
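As a sketch of how dynamic NTK-style RoPE scaling works: once the sequence exceeds the trained context, the rotary frequency base is rescaled so positions beyond the trained range interpolate rather than extrapolate. The rule below matches the dynamic NTK formula used by HF Transformers' Llama-family code (a listed runtime for this model); the `base=10000.0` and `dim=128` values in the example are illustrative assumptions, not LongNoodle's published parameters:

```python
def dynamic_ntk_base(base: float, dim: int, seq_len: int,
                     max_pos: int, scaling_factor: float = 1.0) -> float:
    """Rescale the RoPE frequency base for sequences beyond the trained
    context, following the dynamic NTK rule:
        base' = base * (s * L / L_max - (s - 1)) ** (dim / (dim - 2))
    Below the trained length the base is left unchanged.
    """
    if seq_len <= max_pos:
        return base
    factor = (scaling_factor * seq_len / max_pos) - (scaling_factor - 1)
    return base * factor ** (dim / (dim - 2))
```

Because the adjustment is computed from the current sequence length, short prompts run with the original base and only long prompts pay the precision trade-off; the memory cost of the longer KV cache, however, remains.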

Model Faceplate

Model name              Poolnoodle-LongNoodle
Family                  Poolnoodle
Model Base UUID         9aad7c4e-85b7-472d-a362-7528054435c5
Parameters              Circa 13 billion
Origin                  Llama 2
License                 Llama 2 license and ScaiLabs Model License
Context Length          131072 tokens
RoPE Scaling Supported  Yes, dynamic
Tokenizer               scaitoken-1
Embeddings model        scailar-1
Runtime compatibility   HF Transformers, FastChat

Model Versions

Version  Version String             Release date  UUID
0.1      poolnoodle-0.1-longnoodle  2023-12-22    fa1c2c2f-65be-4937-a58d-9c874a74a993
0.2      poolnoodle-0.2-longnoodle  2023-10-13    4954b497-87f0-4c0e-b29a-fb27bfcda8d1