class OpenAI::TranscriptionRequest

Overview

TranscriptionRequest represents the request structure for the audio transcription API.

Defined in:

openai/api/audio.cr


Constructor Detail

def self.new(pull : JSON::PullParser) #

def self.new(file : File | Path | String, model : String = "whisper-1", prompt : Nil | String = nil, response_format : OpenAI::TranscriptionRespFormat = TranscriptionRespFormat::JSON, temperature : Float64 = 0.0, language : Nil | String = nil) #

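A sketch of typical construction, assuming the shard is loaded with `require "openai"` and the audio file exists at the given path. Only `file` is required; the remaining parameters fall back to the defaults shown in the signature above.

```crystal
require "openai"

request = OpenAI::TranscriptionRequest.new(
  file: "recordings/meeting.mp3",                         # File | Path | String
  model: "whisper-1",                                     # currently the only model
  response_format: OpenAI::TranscriptionRespFormat::JSON, # the default
  temperature: 0.0,
  language: "en"                                          # ISO-639-1 code
)
```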

Instance Method Detail

def build_metada(builder : HTTP::FormData::Builder) #

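This method takes an `HTTP::FormData::Builder` (the method name is spelled `build_metada` in the source), so it can be used inside Crystal's standard `HTTP::FormData.build` block to write the request's fields into a multipart/form-data body. A minimal sketch, assuming `require "openai"` and leaving the file part itself to the caller or the library:

```crystal
require "http"
require "openai"

request = OpenAI::TranscriptionRequest.new(file: "audio.mp3")

# Serialize the request's metadata fields into a multipart body.
io = IO::Memory.new
HTTP::FormData.build(io) do |builder|
  request.build_metada(builder)
end
```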
def file : File | Path | String #

def file=(file : File | Path | String) #

def language : String | Nil #

The language of the input audio. Supplying the input language in ISO-639-1 format will improve accuracy and latency.


def language=(language : String | Nil) #

The language of the input audio. Supplying the input language in ISO-639-1 format will improve accuracy and latency.


def model : String #

ID of the model to use. Only whisper-1 is currently available.


def model=(model : String) #

ID of the model to use. Only whisper-1 is currently available.


def prompt : String | Nil #

An optional text to guide the model's style or continue a previous audio segment. The prompt should match the audio language.


def prompt=(prompt : String | Nil) #

An optional text to guide the model's style or continue a previous audio segment. The prompt should match the audio language.


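The `language` and `prompt` accessors above are plain getters and setters, so the hints can also be supplied after construction. For example (file path is illustrative):

```crystal
require "openai"

request = OpenAI::TranscriptionRequest.new(file: "interview.wav")

# Hint the decoder with the audio's language (ISO-639-1) and a prompt
# in that same language to bias style and vocabulary.
request.language = "en"
request.prompt = "A technical interview about Crystal shards and fibers."
```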
def response_format : TranscriptionRespFormat #

The format of the transcript output, in one of these options: json, text, srt, verbose_json, or vtt.


def response_format=(response_format : TranscriptionRespFormat) #

The format of the transcript output, in one of these options: json, text, srt, verbose_json, or vtt.


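The constructor default shows the `TranscriptionRespFormat::JSON` member; assuming the enum defines one member per listed option (a naming assumption not confirmed by this page), requesting SubRip subtitles might look like:

```crystal
require "openai"

request = OpenAI::TranscriptionRequest.new(file: "talk.mp3")

# Ask for SRT subtitles instead of the default JSON transcript.
# The SRT member name is assumed to mirror the API's "srt" option.
request.response_format = OpenAI::TranscriptionRespFormat::SRT
```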
def temperature : Float64 #

The sampling temperature, between 0 and 1. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic. If set to 0, the model will use log probability to automatically increase the temperature until certain thresholds are hit.


def temperature=(temperature : Float64) #

The sampling temperature, between 0 and 1. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic. If set to 0, the model will use log probability to automatically increase the temperature until certain thresholds are hit.

