Decided to stick with the Ollama LLM backend for now rather than going down the path of downloading and compiling the C++ toolchain required for MLC LLM. Updated the UI to include a model download option for the Llamatik backend to use. Tested as working: models download successfully.
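Since the commit targets the Ollama backend, model downloads presumably go through Ollama's local REST API. A minimal Kotlin sketch, assuming a default Ollama server on port 11434 and using the documented `/api/pull` endpoint; the model name and the plain-JDK HTTP handling are illustrative, not taken from this repository:

```kotlin
import java.net.HttpURLConnection
import java.net.URL

// Sketch: ask a local Ollama server to pull (download) a model.
// Assumes Ollama is running on its default port 11434.
fun pullModel(name: String) {
    val conn = URL("http://localhost:11434/api/pull")
        .openConnection() as HttpURLConnection
    conn.requestMethod = "POST"
    conn.doOutput = true
    conn.setRequestProperty("Content-Type", "application/json")
    conn.outputStream.use { it.write("""{"model": "$name"}""".toByteArray()) }
    // Ollama streams download progress as newline-delimited JSON objects.
    conn.inputStream.bufferedReader().forEachLine { println(it) }
}

fun main() {
    pullModel("llama3.2") // example model name, not from this commit
}
```

In an Android UI this call would run off the main thread (e.g. in a coroutine) with the streamed progress lines driving a progress indicator.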
@@ -5,4 +5,5 @@ plugins {
alias(libs.plugins.kotlin.compose) apply false
// Add the Chaquopy plugin here
id("com.chaquo.python") version "15.0.1" apply false
id("com.google.devtools.ksp") version "2.2.0-2.0.2" apply false
}
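The diff above registers the plugins in the root build file with `apply false`, which only puts them on the build classpath; each module that needs them must apply them itself. A sketch of the corresponding module-level `build.gradle.kts`, assuming an Android application module (the `com.android.application` id and the version-catalog alias are assumptions, not shown in this diff):

```kotlin
// Module-level build.gradle.kts (sketch).
plugins {
    id("com.android.application")          // assumed Android app module
    alias(libs.plugins.kotlin.compose)     // declared in the root plugins block
    id("com.chaquo.python")                // Chaquopy, added in this commit
    id("com.google.devtools.ksp")          // KSP, added in this commit
}
```

Versions are omitted here on purpose: because the root build file pins them with `apply false`, module build files apply the plugins by id alone.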