
About our AI

Meet our team of Poolnoodles! These are our in-house models, trained for specific tasks and purposes.

While we take great care in training our models, they are still just that: AIs. Like all AIs, they can occasionally make a mistake or produce a surprising answer. If you see one of our Noodles misbehaving, please let us know via the Feedback buttons.


Currently, we have several models available, each with its own purpose and characteristics. Please note that not all models may be available to you: which models are used depends on the use cases and personas implemented for your organization.

So, let's briefly introduce our Poolnoodle friends. If you would like to know more about one of the models, or whether using one could benefit your organization, please contact us directly.

BigNoodle

BigNoodle is our all-round workhorse: a general-purpose model trained in multiple languages and on a broad range of generic capabilities. Its strong points are general reasoning and deep general knowledge. The model has its roots in Llama 2.

The model is trained to be helpful and explanatory and, with the appropriate instructions, makes a great base for an AI chatbot. With its context length of more than 8000 tokens, it can hold fairly long conversations and parse considerable amounts of text.
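To illustrate what a context length of 8000 tokens means in practice, here is a minimal sketch of trimming a chat history to a fixed token budget. The `count_tokens` helper is a crude hypothetical stand-in; a real chatbot would use the model's own tokenizer.

```python
def count_tokens(text: str) -> int:
    # Crude stand-in: roughly one token per whitespace-separated word.
    # A real system would use the model's actual tokenizer.
    return len(text.split())

def trim_history(messages: list[str], budget: int = 8000) -> list[str]:
    """Drop the oldest messages until the conversation fits the budget."""
    kept: list[str] = []
    total = 0
    for msg in reversed(messages):  # walk newest-first
        cost = count_tokens(msg)
        if total + cost > budget:
            break
        kept.append(msg)
        total += cost
    return list(reversed(kept))     # restore chronological order
```

The newest messages are kept preferentially, since they usually matter most for the next reply.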

With 75 billion parameters, it is not the leanest model, and consequently not the fastest. Use BigNoodle when you need deep reasoning and deep insights.

CodeNoodle

CodeNoodle is a Large Language Model derived from Code Llama that has been trained and fine-tuned by ScaiLabs. It is based on the Llama 2 13B model and has about 13 billion parameters.

With a stretched context length of 16000 tokens, CodeNoodle can handle fairly large queries. It has been trained on several modern and popular programming and scripting languages, including Python, JavaScript, HTML, CSS, C#, SQL (in several dialects), PHP, Delphi, and PowerShell, among others.

CodeNoodle has been trained both to interpret existing code and to write code based on your descriptions. Please note that while it can solve fairly complex coding problems described in text, it has no real visual capabilities. Hence the glasses.

LongNoodle

LongNoodle is a colleague model of BigNoodle, but with far fewer parameters; it has been trained on mostly the same datasets, though. The key difference between LongNoodle and BigNoodle is that LongNoodle supports a context length of more than 128K tokens.

With this large context length, LongNoodle's primary expertise is reading, processing, and summarizing big chunks of text and data.

LongNoodle can extract information from formatted text such as HTML and other structured markup formats, as long as the source material is plain text.
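As a sketch of the kind of preprocessing that helps here, the following strips HTML down to plain text before handing it to a long-context model. This is a generic standard-library example, not part of LongNoodle itself.

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect the text content of an HTML document, ignoring the tags."""
    def __init__(self) -> None:
        super().__init__()
        self.chunks: list[str] = []

    def handle_data(self, data: str) -> None:
        if data.strip():
            self.chunks.append(data.strip())

def html_to_text(html: str) -> str:
    parser = TextExtractor()
    parser.feed(html)
    return " ".join(parser.chunks)
```

Feeding the extracted text rather than raw HTML keeps the token budget focused on actual content.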

The context length of LongNoodle can be extended even further via RoPE scaling, although this comes at a significant cost in memory.

FixerNoodle

FixerNoodle is an LLM trained to route requests between LLMs and plugins. It is a 13 billion parameter model, based on the original Poolnoodle model, which has its roots in the Vicuna 13B model.

The model has been trained to interact with several so-called "plugins", which allow it to use external resources.
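Conceptually, a routing model's job is to pick a plugin name and an argument; the surrounding code then dispatches the call. The following is a hypothetical sketch of that dispatch layer, not FixerNoodle's actual plugin interface.

```python
from typing import Callable

# Hypothetical plugin registry: each plugin maps a text argument
# to a text result.
PLUGINS: dict[str, Callable[[str], str]] = {
    "calculator": lambda expr: str(eval(expr, {"__builtins__": {}})),
    "echo": lambda text: text,
}

def dispatch(plugin: str, argument: str) -> str:
    """Run the plugin the router model selected, or report a miss."""
    if plugin not in PLUGINS:
        return f"unknown plugin: {plugin}"
    return PLUGINS[plugin](argument)
```

The router model only has to emit a `(plugin, argument)` pair; everything else stays in ordinary code, which keeps the model fast and the plugins auditable.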

The model has been optimized for speed rather than context length or deep understanding. It is capable of summarizing information within its context window.

ToolNoodle

ToolNoodle is a 13 billion parameter model that has been trained to interact with APIs. It has been trained to understand both API documentation and API definitions such as OpenAPI/Swagger, RAML, and SOAP.

ToolNoodle is not only capable of understanding those APIs and explaining them to you; it can also formulate requests, execute them (given the right permissions), and interpret the results where necessary.
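To make this concrete, here is a minimal sketch of the kind of work an API-aware model performs: resolving an intent against an OpenAPI definition into a concrete HTTP request. The spec, URL, and helper below are hypothetical examples, not ToolNoodle internals.

```python
# A tiny, hypothetical OpenAPI-style definition.
spec = {
    "servers": [{"url": "https://api.example.com/v1"}],
    "paths": {
        "/users/{id}": {
            "get": {"summary": "Fetch a user by id"},
        },
    },
}

def build_request(spec: dict, path: str, method: str, **params) -> dict:
    """Resolve path parameters against the spec's base URL."""
    if method not in spec["paths"][path]:
        raise ValueError(f"{method} not defined for {path}")
    base = spec["servers"][0]["url"]
    return {
        "method": method.upper(),
        "url": base + path.format(**params),
    }
```

Given the intent "fetch user 42", `build_request(spec, "/users/{id}", "get", id=42)` yields a `GET` on `https://api.example.com/v1/users/42`, which an execution layer could then send.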

ToolNoodle can be used stand-alone, but it has also been optimized for use in a chain with other Large Language Models.

BabyNoodle

BabyNoodle is our smallest model, optimized to run on embedded systems. It has been built from the ground up by ScaiLabs and trained on a set of basic capabilities.

The primary language the model supports is English, with good understanding of German and French. While the model has some understanding of Dutch, it is not yet good enough to hold a conversation in Dutch.

The model has been trained to interact with its brothers BigNoodle, FixerNoodle, and ToolNoodle when its own capabilities are insufficient to complete a request. Furthermore, it is capable of calling upon a set of local plugins to execute calls.