React Native RAG - v0.2.0

    Interface LLM

    Defines the essential operations for a Large Language Model (LLM). This interface provides a standardized way to interact with various LLM implementations, covering model lifecycle (loading, unloading) and core text generation capabilities. It supports streaming of generated tokens for interactive applications.

    interface LLM {
        generate: (
            messages: Message[],
            callback: (token: string) => void,
        ) => Promise<string>;
        interrupt: () => Promise<void>;
        load: () => Promise<LLM>;
        unload: () => Promise<void>;
    }
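As a sketch of what satisfying this interface looks like, the following hypothetical `EchoLLM` implements all four members. The `Message` shape shown here is an assumption for illustration; the real type is defined elsewhere in the library.

```typescript
// A minimal sketch of an LLM implementation. EchoLLM and the Message
// shape below are hypothetical, for illustration only.
type Message = { role: string; content: string };

interface LLM {
  generate: (messages: Message[], callback: (token: string) => void) => Promise<string>;
  interrupt: () => Promise<void>;
  load: () => Promise<LLM>;
  unload: () => Promise<void>;
}

class EchoLLM implements LLM {
  private interrupted = false;

  async load(): Promise<LLM> {
    // A real implementation would load weights and a tokenizer here.
    return this;
  }

  async generate(messages: Message[], callback: (token: string) => void): Promise<string> {
    this.interrupted = false;
    // "Generate" by echoing the last message word by word.
    const words = messages[messages.length - 1]?.content.split(' ') ?? [];
    const emitted: string[] = [];
    for (const word of words) {
      if (this.interrupted) break; // honour interrupt() between tokens
      callback(word);              // stream each token to the caller
      emitted.push(word);
    }
    return emitted.join(' ');      // resolve with the complete text
  }

  async interrupt(): Promise<void> {
    this.interrupted = true;
  }

  async unload(): Promise<void> {
    // A real implementation would free model memory here.
  }
}
```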


    Properties

    generate: (
        messages: Message[],
        callback: (token: string) => void,
    ) => Promise<string>

    Generates a text response based on a sequence of messages. The callback function allows for streaming tokens as they are generated.

    Type Declaration

      • (messages: Message[], callback: (token: string) => void): Promise<string>
      • Parameters

        • messages: Message[]

          An array of message objects representing the conversation history.

        • callback: (token: string) => void

          A function that is called with each new token generated by the LLM.

        Returns Promise<string>

        A promise that resolves to the complete generated string.
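For example, the callback can accumulate tokens for live display while the returned promise yields the final text. The `llm` object below is a hypothetical stub standing in for a real implementation:

```typescript
// Sketch: consuming generate's streaming callback. The stub llm below is
// hypothetical; a real implementation would produce tokens from a model.
type Message = { role: string; content: string };

const llm = {
  async generate(_messages: Message[], callback: (token: string) => void): Promise<string> {
    const tokens = ['Hello', ',', ' world'];
    for (const t of tokens) callback(t); // invoked once per generated token
    return tokens.join('');              // resolves with the complete text
  },
};

let display = '';
const full = llm.generate(
  [{ role: 'user', content: 'Say hello' }],
  (token) => {
    display += token; // e.g. setState(prev => prev + token) in React Native
  },
);
```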

    interrupt: () => Promise<void>

Interrupts any ongoing text generation. This is useful for stopping long-running generations early, for example when the user cancels a request.

    Type Declaration

      • (): Promise<void>
      • Returns Promise<void>

        A promise that resolves once the interruption is complete.
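A typical pattern is to call interrupt from a UI action (such as a Stop button) while generate is still running. The `StubLLM` below is hypothetical; it models one plausible way an implementation meets this contract, by checking an interrupted flag between tokens:

```typescript
// Sketch: interrupting a long-running generation. StubLLM is a
// hypothetical stand-in whose generate loop checks an interrupted flag.
type Message = { role: string; content: string };

class StubLLM {
  private interrupted = false;

  async generate(_messages: Message[], callback: (token: string) => void): Promise<string> {
    this.interrupted = false;
    let out = '';
    for (let i = 0; i < 1000; i++) {
      if (this.interrupted) break;                 // stop early once interrupted
      const token = `t${i} `;
      callback(token);
      out += token;
      await new Promise((r) => setTimeout(r, 0));  // yield between tokens
    }
    return out; // in this stub, the partial text produced so far
  }

  async interrupt(): Promise<void> {
    this.interrupted = true;
  }
}

// Usage: interrupt after the first few tokens arrive.
const llm = new StubLLM();
let count = 0;
const result = llm.generate([{ role: 'user', content: 'write a long story' }], () => {
  count++;
  if (count === 5) llm.interrupt(); // e.g. wired to a "Stop" button
});
```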

    load: () => Promise<LLM>

    Loads the LLM model resources (e.g., weights, tokenizer) into memory. This should be called before attempting to generate text.

    Type Declaration

      • (): Promise<LLM>
      • Returns Promise<LLM>

        A promise that resolves to the instance of the LLM once loaded.

    unload: () => Promise<void>

    Unloads the LLM and its associated resources from memory. This is typically used to free up system resources when the model is no longer needed.

    Type Declaration

      • (): Promise<void>
      • Returns Promise<void>

        A promise that resolves once the model unloading is complete.
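Putting the lifecycle together: load before generating, and unload in a finally block so resources are released even if generation fails. `LifecycleStub` is a hypothetical stand-in for a real implementation:

```typescript
// Sketch of the load -> generate -> unload lifecycle. LifecycleStub is
// hypothetical, for illustration only.
type Message = { role: string; content: string };

class LifecycleStub {
  private loaded = false;

  async load(): Promise<LifecycleStub> {
    this.loaded = true; // a real implementation loads weights/tokenizer here
    return this;        // resolves to the instance, so it can be used directly
  }

  async generate(messages: Message[], callback: (token: string) => void): Promise<string> {
    if (!this.loaded) throw new Error('model not loaded');
    const reply = `echo: ${messages[messages.length - 1]?.content ?? ''}`;
    callback(reply);
    return reply;
  }

  async unload(): Promise<void> {
    this.loaded = false; // a real implementation frees model memory here
  }
}

// Usage: unload runs whether generation succeeds or throws.
async function run(): Promise<string> {
  const llm = await new LifecycleStub().load(); // must complete before generate
  try {
    return await llm.generate([{ role: 'user', content: 'hi' }], () => {});
  } finally {
    await llm.unload();
  }
}
```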