steampipe plugin install mr-destructive/cohereai

Table: cohereai_generation

Create generations for a given text prompt.

Notes:

  • A prompt, supplied either through the prompt column or through settings -> 'prompt', is required as a where qualifier for all queries.
  • The likelihood and token_likelihoods fields only return values if return_likelihoods is set to either GENERATION or ALL.

The return_likelihoods parameter can be set to GENERATION or ALL. With GENERATION, the API returns the likelihood for the generated text only; with ALL, the likelihood is returned for both the generated text and the prompt. The plugin's default is GENERATION.
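
For example, because settings accepts any generate API request parameter, return_likelihoods can be passed there to populate the likelihood column. A minimal sketch, assuming the value is forwarded to the API unchanged:

select
  generation,
  likelihood
from
  cohereai_generation
where
  settings = '{"return_likelihoods": "GENERATION"}'
  and prompt = 'Give suggestions for a title for a science-fiction novel.';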

Examples

Basic example with a simple prompt

select
  generation
from
  cohereai_generation
where
  prompt = 'Give suggestions for a title for a science-fiction novel.';

Generation with specific settings (model, number of responses, etc.)

select
  generation
from
  cohereai_generation
where
  settings = '{
    "model": "command-light",
    "num_generations": 3,
    "max_tokens": 100,
    "temperature": 0.9,
    "top_p": 1.0,
    "frequency_penalty": 0.0
  }'
  and prompt = 'Give suggestions for a title for a science-fiction novel.';
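
Since num_generations is set to 3, the API produces three completions for the prompt; the table should surface each completion as its own row.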

Pass the prompt string through settings

select
  generation
from
  cohereai_generation
where
  settings = '{
    "prompt": "Write app ideas for AI-related domains."
  }';

Spell check a piece of text

select
  generation
from
  cohereai_generation
where
  settings = '{"num_generations": 1}'
  and prompt = 'Check the sample. Sample: "The impotance of effictive comunication. This is an exmaple artcile abot missplelled wrds."';
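
Expand per-token likelihoods

The token_likelihoods column can be unnested with standard Postgres JSON functions. This is a sketch that assumes the column holds a JSON array of objects with token and likelihood keys, mirroring the Cohere generate API response:

select
  g.generation,
  t ->> 'token' as token,
  (t ->> 'likelihood')::double precision as token_likelihood
from
  cohereai_generation as g,
  -- assumes token_likelihoods is an array of {token, likelihood} objects
  jsonb_array_elements(g.token_likelihoods) as t
where
  g.settings = '{"return_likelihoods": "GENERATION"}'
  and g.prompt = 'Give suggestions for a title for a science-fiction novel.';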

Schema for cohereai_generation

Name              | Type             | Operators | Description
------------------|------------------|-----------|------------------------------------------------------------
_ctx              | jsonb            |           | Steampipe context in JSON form, e.g. connection_name.
generation        | text             |           | Generation for a given text prompt.
likelihood        | double precision |           | The likelihood of the generated text.
prompt            | text             | =         | The prompt to get generations for, encoded as a string.
settings          | jsonb            | =         | Settings is a JSONB object that accepts any of the generate API request parameters.
token_likelihoods | jsonb            |           | The likelihood of the generated tokens/prompt.