Poolnoodle-CodeNoodle

Poolnoodle-CodeNoodle is a Large Language Model derived from Code Llama, trained and fine-tuned by ScaiLabs. It is based on the Llama 2 13B model and has roughly 13 billion parameters.

With an extended context length of 16,384 tokens, Poolnoodle-CodeNoodle can handle large prompts, including sizeable code files.
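Before sending a long prompt, it can be useful to check whether it is likely to fit in the 16,384-token window. The sketch below uses a common 4-characters-per-token rule of thumb; this ratio is an assumption, not a property of the model, and exact counts require the actual Llama 2 tokenizer.

```python
# Rough prompt-budget check against Poolnoodle-CodeNoodle's 16,384-token window.
# CHARS_PER_TOKEN is a heuristic assumption; use the Llama 2 tokenizer for
# exact token counts.

CONTEXT_LENGTH = 16384   # tokens, per the model faceplate
CHARS_PER_TOKEN = 4      # assumed rule of thumb

def estimate_tokens(text: str) -> int:
    """Very rough token estimate based on character count."""
    return max(1, len(text) // CHARS_PER_TOKEN)

def fits_context(prompt: str, reserve_for_output: int = 1024) -> bool:
    """True if the prompt likely fits, leaving room for the model's reply."""
    return estimate_tokens(prompt) + reserve_for_output <= CONTEXT_LENGTH

print(fits_context("def add(a, b):\n    return a + b\n"))  # a small snippet fits easily
```

A larger `reserve_for_output` leaves more of the window for the generated answer, which matters for tasks like writing whole functions rather than one-line fixes.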

Poolnoodle-CodeNoodle has been extensively trained on the following languages:

  • Python
  • JavaScript
  • HTML
  • CSS
  • C#
  • Java
  • C
  • C++
  • SQL in several dialects
  • PHP
  • Ruby
  • Swift
  • Rust
  • Go
  • Perl
  • Haskell
  • Scala
  • Visual Basic .NET
  • Pascal/Delphi
  • Erlang
  • Bash scripting
  • PowerShell

Poolnoodle-CodeNoodle has been trained both to interpret existing code and to write code from your descriptions. Please note that while it can solve fairly complex coding problems described in text, it has no visual capabilities.

Model Faceplate

Model name              Poolnoodle-CodeNoodle
Family                  Poolnoodle
Model Base UUID         d9cb603a-7f7a-4555-8cad-d99532e2b40f
Parameters              Circa 13B
Origin                  Llama 2 / Code Llama
License                 Llama 2 license and ScaiLabs Model License
Context Length          16384
RoPE Scaling Supported  No
Tokenizer               Llama 2 tokenizer
Embeddings model        Llama 2 embeddings
Runtime compatibility   HF Transformers, FastChat
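Since the model descends from Code Llama's Llama 2 base, its instruct prompts most plausibly follow the Llama 2 `[INST]` chat template used by Code Llama's instruct variants. The helper below is a minimal sketch under that assumption; whether Poolnoodle-CodeNoodle keeps this exact template should be confirmed against ScaiLabs' documentation.

```python
# Minimal prompt-builder sketch for the Llama 2 [INST] chat format, as used
# by Code Llama instruct variants. That Poolnoodle-CodeNoodle uses this exact
# template is an assumption, not confirmed by the model card.

def build_prompt(instruction: str, system: str = "") -> str:
    """Wrap a single-turn instruction in the Llama 2 chat template."""
    if system:
        instruction = f"<<SYS>>\n{system}\n<</SYS>>\n\n{instruction}"
    return f"<s>[INST] {instruction} [/INST]"

prompt = build_prompt(
    "Write a Python function that reverses a linked list.",
    system="You are a careful coding assistant.",
)
# The resulting string can then be tokenized and passed to the model
# through HF Transformers or served via FastChat.
```

Keeping prompt construction in one place makes it easy to swap templates later if the model family changes its chat format.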

Model Versions

Version  Version String             Release date  UUID
0.1      poolnoodle-0.1-codenoodle  2024-01-31    4f74f8ba-6537-4633-bd02-b8d4ddb6d545
0.2      poolnoodle-0.2-codenoodle  2024-02-07    219289e9-648a-4e5c-bf25-6f38499dc5cf