
# @capacitor/local-llm

Capacitor Local LLM plugin

## Install

```bash
npm install @capacitor/local-llm
npx cap sync
```

## API

The main plugin interface for interacting with on-device LLMs.

### systemAvailability()

```typescript
systemAvailability() => Promise<SystemAvailabilityResponse>
```

Checks the availability status of the on-device LLM.

Use this method to determine whether the LLM is ready to use, needs to be downloaded, or is unavailable on the device.

**Returns:** `Promise<SystemAvailabilityResponse>`

**Since:** 1.0.0
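
A minimal sketch of acting on the returned status. The `LLMAvailability` union is taken from the Type Aliases section below; the mapping of each status to a next step is an illustration, not part of the plugin API.

```typescript
// The documented availability statuses (see Type Aliases below).
type LLMAvailability =
  'available' | 'unavailable' | 'notready' | 'downloadable' | 'responding';

// Illustrative helper: map a status to the next step an app might take.
function nextAction(status: LLMAvailability): 'prompt' | 'download' | 'wait' | 'fallback' {
  switch (status) {
    case 'available':
      return 'prompt'; // model is on-device and ready
    case 'downloadable':
      return 'download'; // model must be fetched first
    case 'notready':
    case 'responding':
      return 'wait'; // busy or still initializing; check again later
    case 'unavailable':
      return 'fallback'; // this device cannot run the on-device model
  }
}

// Sketch of the plugin call itself (needs a device at runtime):
// const { status } = await LocalLLM.systemAvailability();
// nextAction(status);
```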


### download()

```typescript
download() => Promise<void>
```

Downloads the on-device LLM model.

This method initiates the download of the LLM model when it is not already present on the device. Only available on Android.

**Since:** 1.0.0
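
A sketch of an "ensure the model is present" flow combining `systemAvailability()` and `download()` as documented above. `LocalLLMLike` is a local stand-in for the plugin's interface so the flow can be exercised without a device; in an app you would pass the real plugin export instead.

```typescript
// Minimal stand-in for the two plugin methods this flow needs.
interface LocalLLMLike {
  systemAvailability(): Promise<{ status: string }>;
  download(): Promise<void>;
}

// Returns true once the model is available, downloading it if possible.
async function ensureModel(llm: LocalLLMLike): Promise<boolean> {
  const { status } = await llm.systemAvailability();
  if (status === 'available') return true;
  if (status === 'downloadable') {
    await llm.download(); // Android only, per the docs above
    const after = await llm.systemAvailability();
    return after.status === 'available';
  }
  return false; // 'unavailable', 'notready', or 'responding'
}
```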


### prompt(...)

```typescript
prompt(options: PromptOptions) => Promise<PromptResponse>
```

Sends a prompt to the on-device LLM and receives a response.

Use this method to interact with the LLM. You can optionally provide a `sessionId` to maintain conversation context across multiple prompts.

| Param | Type | Description |
| --- | --- | --- |
| **`options`** | `PromptOptions` | The prompt options, including the text prompt and optional configuration |

**Returns:** `Promise<PromptResponse>`

**Since:** 1.0.0
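
A sketch of assembling `PromptOptions` for a one-off, stateless call (no `sessionId`). The field names follow the Interfaces section below, including the documented `maximiumOutputTokens` spelling; the local interface declarations and the `summaryOptions` helper are illustrative, not part of the plugin.

```typescript
// Local copies of the documented option shapes (see Interfaces below).
interface LLMOptions {
  temperature?: number;
  maximiumOutputTokens?: number; // documented spelling, kept by the API
}
interface PromptOptions {
  prompt: string;
  sessionId?: string;
  instructions?: string;
  options?: LLMOptions;
}

// Illustrative helper: low temperature for a focused, deterministic summary.
function summaryOptions(text: string): PromptOptions {
  return {
    prompt: `Summarize in one sentence: ${text}`,
    instructions: 'You are a concise assistant.',
    options: { temperature: 0.2, maximiumOutputTokens: 128 },
  };
}

// Sketch of the call itself (needs the plugin at runtime):
// const { text } = await LocalLLM.prompt(summaryOptions(article));
```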


### endSession(...)

```typescript
endSession(options: EndSessionOptions) => Promise<void>
```

Ends an active LLM session.

Use this method to clean up resources when you're done with a conversation session. This is important for managing memory and preventing resource leaks.

| Param | Type | Description |
| --- | --- | --- |
| **`options`** | `EndSessionOptions` | The options containing the `sessionId` to end |

**Since:** 1.0.0
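
A sketch of a short multi-turn conversation that always releases its session, assuming the documented `prompt()`/`endSession()` shapes. The plugin object is injected (`SessionLLM` is a local stand-in) so the flow can be exercised without a device, and the `sessionId` value is hypothetical — any stable string works.

```typescript
// Minimal stand-in for the two plugin methods this flow needs.
interface SessionLLM {
  prompt(opts: { prompt: string; sessionId?: string }): Promise<{ text: string }>;
  endSession(opts: { sessionId: string }): Promise<void>;
}

async function askTwice(llm: SessionLLM, first: string, second: string): Promise<string[]> {
  const sessionId = 'chat-1'; // hypothetical value; any stable string
  try {
    // Same sessionId on both calls, so the second prompt sees the first.
    const a = await llm.prompt({ sessionId, prompt: first });
    const b = await llm.prompt({ sessionId, prompt: second });
    return [a.text, b.text];
  } finally {
    // Always end the session, even on error, to avoid resource leaks.
    await llm.endSession({ sessionId });
  }
}
```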


### generateImage(...)

```typescript
generateImage(options: GenerateImageOptions) => Promise<GenerateImageResponse>
```

Generates images from a text prompt using the on-device LLM.

Use this method to create images based on text descriptions. Optionally provide reference images to influence the generation. The generated images are returned as an array of base64-encoded PNG strings.

| Param | Type | Description |
| --- | --- | --- |
| **`options`** | `GenerateImageOptions` | The image generation options, including the prompt, optional reference images, and count |

**Returns:** `Promise<GenerateImageResponse>`

**Since:** 1.0.0
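
A small helper for displaying results. Per `GenerateImageResponse` below, each entry in `pngBase64Images` is raw base64 without a data URI prefix, so it must be prefixed before use as an `<img>` `src`. The `LocalLLM` export name in the commented usage is an assumption.

```typescript
// Turn a raw base64 PNG string from generateImage() into an <img>-ready URI.
function toDataUri(pngBase64: string): string {
  return `data:image/png;base64,${pngBase64}`;
}

// Hypothetical usage (needs the plugin at runtime):
// const { pngBase64Images } = await LocalLLM.generateImage({ prompt: 'a red fox', count: 2 });
// imgEl.src = toDataUri(pngBase64Images[0]);
```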


## Interfaces

### SystemAvailabilityResponse

Response containing the system availability status of the on-device LLM.

| Prop | Type | Description | Since |
| --- | --- | --- | --- |
| **`status`** | `LLMAvailability` | The current availability status of the LLM. | 1.0.0 |

### PromptResponse

Response from the LLM after processing a prompt.

| Prop | Type | Description | Since |
| --- | --- | --- | --- |
| **`text`** | `string` | The text response generated by the LLM. | 1.0.0 |

### PromptOptions

Options for sending a prompt to the LLM.

| Prop | Type | Description | Since |
| --- | --- | --- | --- |
| **`sessionId`** | `string` | Optional session identifier for maintaining conversation context. Provide the same `sessionId` across multiple prompts to maintain context. If not provided, each prompt is treated as independent. | 1.0.0 |
| **`instructions`** | `string` | System-level instructions to guide the LLM's behavior. Use this to set the role, tone, or constraints for the LLM's responses. | 1.0.0 |
| **`options`** | `LLMOptions` | Configuration options for controlling LLM inference behavior. | 1.0.0 |
| **`prompt`** | `string` | The text prompt to send to the LLM. | 1.0.0 |

### LLMOptions

Configuration options for LLM inference behavior.

| Prop | Type | Description | Since |
| --- | --- | --- | --- |
| **`temperature`** | `number` | Controls randomness in the model's output. Higher values (e.g., 0.8) make output more random, while lower values (e.g., 0.2) make it more focused and deterministic. | 1.0.0 |
| **`maximiumOutputTokens`** | `number` | The maximum number of tokens to generate in the response. Note: This property name contains a typo ("maximium" instead of "maximum") but is kept for API consistency. | 1.0.0 |

### EndSessionOptions

Options for ending an active LLM session.

| Prop | Type | Description | Since |
| --- | --- | --- | --- |
| **`sessionId`** | `string` | The identifier of the session to end. This should match the `sessionId` used in previous `prompt()` calls. | 1.0.0 |

### GenerateImageResponse

Response containing the generated image data.

| Prop | Type | Description | Since |
| --- | --- | --- | --- |
| **`pngBase64Images`** | `string[]` | Array of generated images as base64-encoded PNG strings. Each string contains raw base64 data (without a data URI prefix). To use in an `img` tag, prefix with `'data:image/png;base64,'`. | 1.0.0 |

### GenerateImageOptions

Options for generating an image from a text prompt.

| Prop | Type | Description | Default | Since |
| --- | --- | --- | --- | --- |
| **`prompt`** | `string` | The text prompt describing the image to generate. | | 1.0.0 |
| **`promptImages`** | `string[]` | Optional array of reference images to influence the generated output. Provide base64-encoded image strings (with or without a data URI prefix) that will be used as visual context or inspiration for the image generation. This allows you to combine text and image concepts for more controlled output. | | 1.0.0 |
| **`count`** | `number` | The number of image variations to generate. Defaults to 1 if not specified. | `1` | 1.0.0 |

## Type Aliases

### LLMAvailability

Availability status of the on-device LLM.

`'available' | 'unavailable' | 'notready' | 'downloadable' | 'responding'`