Decided to stick with the Ollama LLM backend for now rather than going down the C++ download-and-compile path required for MLC LLM. Updated the UI to include the model downloader for the Llamatik backend. Tested as working; models download successfully.
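For context on the model-downloading flow this commit wires up, a minimal sketch of pulling a model through Ollama's REST API is shown below. Ollama exposes a `POST /api/pull` endpoint that streams newline-delimited JSON status objects, which a UI can consume to show download progress. The function names, default host, and model name here are assumptions for illustration, not the app's actual code.

```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

// Hypothetical helper: build the JSON body Ollama's /api/pull expects.
fun pullRequestBody(model: String): String = """{"model": "$model"}"""

// Hypothetical sketch: request a model pull and stream status lines.
// Each streamed line is a JSON object such as {"status":"pulling manifest"},
// so a UI layer could parse these lines to drive a progress indicator.
fun pullModel(model: String, host: String = "http://localhost:11434") {
    val request = HttpRequest.newBuilder()
        .uri(URI.create("$host/api/pull"))
        .header("Content-Type", "application/json")
        .POST(HttpRequest.BodyPublishers.ofString(pullRequestBody(model)))
        .build()
    val client = HttpClient.newHttpClient()
    client.send(request, HttpResponse.BodyHandlers.ofLines())
        .body()
        .forEach { line -> println(line) } // forward each status line to the UI
}
```

A backend abstraction like this keeps the UI independent of how models arrive, which is what makes deferring the MLC LLM build path a low-cost decision.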
@@ -1,6 +1,6 @@
 [versions]
 agp = "8.13.2"
-kotlin = "2.0.21"
+kotlin = "2.2.0"
 coreKtx = "1.17.0"
 junit = "4.13.2"
 junitVersion = "1.3.0"