| azure_openai | Azure OpenAI Endpoint Provider Function |
| azure_openai_chat | Send LLM Messages to an Azure OpenAI Chat Completions Endpoint |
| azure_openai_embedding | Generate Embeddings Using the OpenAI API on Azure |
| cancel_openai_batch | Cancel an In-Progress OpenAI Batch |
| chat | Chat with a Language Model |
| chat_ellmer | Send LLM Messages to an ellmer Chat Object |
| check_azure_openai_batch | Check Batch Processing Status for Azure OpenAI Batch API |
| check_batch | Check Batch Processing Status |
| check_claude_batch | Check Batch Processing Status for Claude API |
| check_gemini_batch | Check the Status of a Gemini Batch Operation |
| check_groq_batch | Check Batch Processing Status for Groq API |
| check_job | Check the Status of a Batch or Research Job |
| check_mistral_batch | Check Batch Processing Status for Mistral Batch API |
| check_openai_batch | Check Batch Processing Status for OpenAI Batch API |
| claude | Provider Function for Claude Models on the Anthropic API |
| claude_chat | Interact with Claude AI Models via the Anthropic API |
| claude_delete_file | Delete a File from the Claude API |
| claude_file_metadata | Retrieve Metadata for a File from the Claude API |
| claude_list_files | List Files in the Claude API |
| claude_list_models | List Available Models from the Anthropic Claude API |
| claude_upload_file | Upload a File to the Claude API |
| claude_websearch | Built-in Claude Web Search Tool |
| deepseek | DeepSeek Provider Function |
| deepseek_chat | Send LLM Messages to the DeepSeek Chat API |
| deep_research | Run Deep Research via a Provider |
| df_llm_message | Convert a Data Frame to an LLMMessage Object |
| ellmer | Alias for the Ellmer Provider Function |
| ellmer_tool | Convert an ellmer Tool to a tidyllm TOOL |
| embed | Generate Text Embeddings |
| fetch_azure_openai_batch | Fetch Results for an Azure OpenAI Batch |
| fetch_batch | Fetch Results from a Batch API |
| fetch_claude_batch | Fetch Results for a Claude Batch |
| fetch_gemini_batch | Fetch Results for a Gemini Batch |
| fetch_groq_batch | Fetch Results for a Groq Batch |
| fetch_job | Fetch Results from a Batch or Research Job |
| fetch_mistral_batch | Fetch Results for a Mistral Batch |
| fetch_openai_batch | Fetch Results for an OpenAI Batch |
| field_chr | Define Field Descriptors for JSON Schema |
| field_dbl | Define Field Descriptors for JSON Schema |
| field_fct | Define Field Descriptors for JSON Schema |
| field_lgl | Define Field Descriptors for JSON Schema |
| field_object | Define a nested object field |
| gemini | Google Gemini Provider Function |
| gemini_chat | Send LLM Messages to the Gemini API |
| gemini_delete_file | Delete a File from the Gemini API |
| gemini_embedding | Generate Embeddings Using the Google Gemini API |
| gemini_file_metadata | Retrieve Metadata for a File from the Gemini API |
| gemini_list_files | List Files in the Gemini API |
| gemini_list_models | List Available Models from the Google Gemini API |
| gemini_upload_file | Upload a File to the Gemini API |
| get_logprobs | Retrieve Log Probabilities from Assistant Replies |
| get_metadata | Retrieve Metadata from Assistant Replies |
| get_reply | Retrieve Assistant Reply as Text |
| get_reply_data | Retrieve Assistant Reply as Structured Data |
| get_user_message | Retrieve a User Message by Index |
| groq | Groq API Provider Function |
| groq_chat | Send LLM Messages to the Groq Chat API |
| groq_list_models | List Available Models from the Groq API |
| groq_transcribe | Transcribe an Audio File Using the Groq Transcription API |
| img | Create an Image Object |
| last_metadata | Retrieve Metadata from Assistant Replies |
| last_reply | Retrieve Assistant Reply as Text |
| last_reply_data | Retrieve Assistant Reply as Structured Data |
| last_user_message | Retrieve a User Message by Index |
| list_azure_openai_batches | List Azure OpenAI Batch Requests |
| list_batches | List all Batch Requests on a Batch API |
| list_claude_batches | List Claude Batch Requests |
| list_gemini_batches | List Recent Gemini Batch Operations |
| list_groq_batches | List Groq Batch Requests |
| list_hf_gguf_files | List GGUF Files Available in a Hugging Face Repository |
| list_mistral_batches | List Mistral Batch Requests |
| list_models | List Available Models for a Provider |
| list_openai_batches | List OpenAI Batch Requests |
| llamacpp | llama.cpp Provider Function |
| llamacpp_chat | Send LLM Messages to a llama.cpp Server |
| llamacpp_delete_model | Delete a Local GGUF Model File |
| llamacpp_download_model | Download a GGUF Model from Hugging Face |
| llamacpp_embedding | Generate Embeddings Using a llama.cpp Server |
| llamacpp_health | Check Health of the llama.cpp Server |
| llamacpp_list_local_models | List Local GGUF Model Files |
| llamacpp_list_models | List Models Loaded in the llama.cpp Server |
| llamacpp_rerank | Rerank Documents Using a llama.cpp Server |
| LLMMessage | Large Language Model Message Class |
| llm_message | Create or Update Large Language Model Message Object |
| mistral | Mistral Provider Function |
| mistral_chat | Send LLM Messages to the Mistral API |
| mistral_embedding | Generate Embeddings Using the Mistral API |
| mistral_list_models | List Available Models from the Mistral API |
| ollama | Ollama API Provider Function |
| ollama_chat | Interact with Local AI Models via the Ollama API |
| ollama_delete_model | Delete a Model from the Ollama API |
| ollama_download_model | Download a Model from the Ollama API |
| ollama_embedding | Generate Embeddings Using the Ollama API |
| ollama_list_models | Retrieve Model Information from the Ollama API |
| openai | OpenAI Provider Function |
| openai_chat | Send LLM Messages to the OpenAI Chat Completions API |
| openai_embedding | Generate Embeddings Using the OpenAI API |
| openai_list_models | List Available Models from the OpenAI API |
| openrouter | OpenRouter Provider Function |
| openrouter_chat | Send LLM Messages to the OpenRouter Chat API |
| openrouter_credits | Get OpenRouter Credit Balance |
| openrouter_embedding | Generate Embeddings Using the OpenRouter API |
| openrouter_generation | Get Details for an OpenRouter Generation |
| openrouter_list_models | List Available Models on OpenRouter |
| pdf_page_batch | Batch Process PDF into LLM Messages |
| perplexity | Perplexity Provider Function |
| perplexity_chat | Send LLM Messages to the Perplexity Chat API |
| perplexity_check_research | Check the Status of a Perplexity Deep Research Job |
| perplexity_deep_research | Submit a Deep Research Request to Perplexity |
| perplexity_fetch_research | Fetch Results from a Completed Perplexity Deep Research Job |
| rate_limit_info | Get Current Rate Limit Information for All APIs or a Specific API |
| send_azure_openai_batch | Send a Batch of Messages to the Azure OpenAI Batch API |
| send_batch | Send a Batch of Messages to a Batch API |
| send_claude_batch | Send a Batch of Messages to the Claude API |
| send_gemini_batch | Submit a List of LLMMessage Objects to the Gemini Batch API |
| send_groq_batch | Send a Batch of Messages to the Groq API |
| send_mistral_batch | Send a Batch of Requests to the Mistral API |
| send_ollama_batch | Send a Batch of Messages to the Ollama API |
| send_openai_batch | Send a Batch of Messages to the OpenAI Batch API |
| tidyllm_schema | Create a JSON Schema for Structured Outputs |
| tidyllm_tool | Create a Tool Definition for tidyllm |
| voyage | Voyage Provider Function |
| voyage_embedding | Generate Embeddings Using the Voyage AI API |
| voyage_rerank | Rerank Documents Using the Voyage AI API |
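
The verbs above compose into pipelines: `llm_message()` builds a message history, `chat()` sends it to a provider function, and `get_reply()` extracts the answer. A minimal sketch of that core workflow, assuming the package is attached, a valid `OPENAI_API_KEY` is set, and noting that the model name is illustrative:

```r
library(tidyllm)

# Build a message history and send it to the OpenAI provider
conversation <- llm_message("Explain batch APIs in one sentence.") |>
  chat(openai(.model = "gpt-4o-mini"))  # model name is illustrative

# Extract the assistant reply as plain text
get_reply(conversation)
```

The same pipeline works with any provider function in the table (`claude()`, `gemini()`, `ollama()`, etc.), since `chat()` dispatches on the provider it is given.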