# Mirai

Listen to our Podcast · View our Deck · Contact Us

Mirai is an SDK for highly optimized on-device LLM inference.

## Usage

### Download model

```swift
import Mirai

// Pick a model from the registry and fetch its files to the device.
let engine = MiraiEngine.shared
let model = ModelsRegistry.llama_3_2_1b_instruct
guard let files = try await engine.storage.download(model: model) else {
    return
}
```
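`download` can throw (on network failures, for example), so in a real app you would typically catch the error rather than let it propagate. A minimal sketch using the same API shown above; the wrapper function and the `ModelFiles` return type are assumptions, not part of this README:

```swift
import Mirai

// Hypothetical wrapper: returns nil on failure instead of throwing.
// `ModelFiles` is an assumed name for whatever
// `engine.storage.download(model:)` returns.
func downloadModel() async -> ModelFiles? {
    let engine = MiraiEngine.shared
    let model = ModelsRegistry.llama_3_2_1b_instruct
    do {
        return try await engine.storage.download(model: model)
    } catch {
        print("Model download failed: \(error)")
        return nil
    }
}
```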

### Create inference session

- You can optionally select a configuration for your specific use case from the following list: `general`, `chat`, `summarization`, `classification`.
- Choosing the appropriate configuration significantly increases inference speed by utilizing specialized optimization techniques.

```swift
let session = try await engine.session(model: model, modelFiles: files)
try await session.updateConfiguration(.forType(.general))
try await session.load()
```
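For a conversational app you would presumably pick the `chat` configuration instead of `general`. A sketch, assuming each configuration in the list above has a matching case under `.forType(...)` (the `.chat` case is an assumption; this README only shows `.general`):

```swift
// Assumption: general, chat, summarization, and classification each
// map to a case accepted by .forType(...).
let chatSession = try await engine.session(model: model, modelFiles: files)
try await chatSession.updateConfiguration(.forType(.chat))
try await chatSession.load()
```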

### Run

If you want to get an answer for a specific prompt, use the text input:

```swift
try await session.run(input: .text("Ultimate Question of Life, the Universe, and Everything")) { result in
    if result.finished {
        print(result.text)
    }
}
```

You can also use a list of messages as an input:

```swift
let messages = [
    Message(text: "Hi", role: .user),
    Message(text: "How can I help you?", role: .assistant),
    Message(text: "Tell me a story", role: .user)
]
try await session.run(input: .messages(messages)) { result in
    if result.finished {
        print(result.text)
    }
}
```
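The examples above print only the final result. If the callback also fires with intermediate results during generation (an assumption; this README only documents `result.finished` and `result.text`), you could surface partial output as it streams in:

```swift
// Hypothetical streaming sketch. Assumptions: the callback fires
// repeatedly while tokens are generated, and `result.text` holds the
// text produced so far.
try await session.run(input: .text("Tell me a story")) { result in
    updateLabel(result.text) // hypothetical UI update helper
    if result.finished {
        print("Final answer: \(result.text)")
    }
}
```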

## Thank you!

If you have any questions, just drop us a message at [email protected].
