Temperature

In the context of Generative AI, "Temperature" is a parameter of the probability distribution used during generation. It controls the randomness of the model's predictions by scaling the logits before the softmax is applied. A high temperature flattens the distribution and produces more varied, less predictable outputs, while a low temperature sharpens it and makes the outputs more deterministic, favoring the most likely tokens. Temperature is therefore a key lever for balancing diversity against quality in generated content.
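
To make the mechanism concrete, here is a minimal sketch of temperature scaling in Python. It is an illustration only, not the sampler used by any particular model; the function name and the example logits are made up for the demonstration.

```python
import numpy as np

def sample_with_temperature(logits, temperature=1.0, rng=None):
    """Sample a token index from logits scaled by temperature.

    Lower temperature sharpens the distribution (more deterministic);
    higher temperature flattens it (more random).
    """
    rng = rng or np.random.default_rng()
    scaled = np.asarray(logits, dtype=float) / temperature
    # Softmax with max-subtraction for numerical stability
    probs = np.exp(scaled - scaled.max())
    probs /= probs.sum()
    return rng.choice(len(probs), p=probs)

# Hypothetical logits for a 3-token vocabulary, sampled at different temperatures
logits = [2.0, 1.0, 0.2]
for t in (0.2, 1.0, 2.0):
    samples = [sample_with_temperature(logits, t) for _ in range(1000)]
    print(t, np.bincount(samples, minlength=len(logits)) / 1000)
```

At a low temperature (e.g. 0.2) almost all samples come from the highest-scoring token; at a high temperature (e.g. 2.0) the choices spread more evenly across the vocabulary.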

As an analogy to human cognition, temperature can be compared to a level of creativity: a high temperature value yields more creative, free-ranging outputs, while a low temperature value yields more conservative, predictable ones.

