Poolnoodle-CodeNoodle

Poolnoodle-CodeNoodle is a large language model derived from Code Llama and further trained and fine-tuned by ScaiLabs. It is based on the Llama 2 13B model and has roughly 13 billion parameters.

With an extended context length of 16,000 tokens, Poolnoodle-CodeNoodle can handle large queries, such as entire source files in a single prompt.
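Whether a given input actually fits in that window depends on how it tokenizes. As a minimal sketch, assuming the model ships with a standard Hugging Face tokenizer (the checkpoint identifier below is hypothetical), you could count tokens before sending a query:

```python
# Sketch: check that a prompt fits within the 16,000-token context window.
# The model identifier is hypothetical; substitute the actual checkpoint name.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("ScaiLabs/Poolnoodle-CodeNoodle")

with open("large_source_file.py") as f:
    prompt = "Explain what this module does:\n\n" + f.read()

n_tokens = len(tokenizer(prompt)["input_ids"])
print(f"{n_tokens} tokens ({'fits within' if n_tokens <= 16000 else 'exceeds'} the 16,000-token context)")
```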

Poolnoodle-CodeNoodle has been extensively trained on the following languages:

  • Python
  • JavaScript
  • HTML
  • CSS
  • C#
  • Java
  • C
  • C++
  • SQL in several dialects
  • PHP
  • Ruby
  • Swift
  • Rust
  • Go
  • Perl
  • Haskell
  • Scala
  • Visual Basic .NET
  • Pascal/Delphi
  • Erlang
  • Bash scripting
  • PowerShell

Poolnoodle-CodeNoodle has been trained both to interpret existing code and to write code from your descriptions. Please note that while it can solve fairly complex coding problems described in text, it has no visual capabilities and cannot interpret images, screenshots, or diagrams.
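The exact serving setup is not described here. As a minimal sketch, assuming the model is published as a standard Hugging Face causal-LM checkpoint (the identifier `ScaiLabs/Poolnoodle-CodeNoodle` below is hypothetical), a text-to-code query could look like this:

```python
# Sketch: generating code from a natural-language description.
# The model identifier is hypothetical; substitute the actual checkpoint name.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "ScaiLabs/Poolnoodle-CodeNoodle"  # hypothetical identifier

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

prompt = "Write a Python function that returns the n-th Fibonacci number iteratively."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Generate a completion; the sampling settings are illustrative, not recommendations.
output_ids = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.2)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

The same pattern works for code interpretation: pass existing code in the prompt (within the 16,000-token context limit) and ask the model to explain, review, or refactor it.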