|
|
8ba9cb7a62
|
Check-in partway through the engine update to MediaPipe. The system is working; now troubleshooting the LLM models and sizes and how they handle the JSON and tool-calling loops. Checking in to allow development on the laptop from here as well.
|
2026-03-04 18:58:03 +11:00 |
|
|
|
93a2c48e4b
|
Made the decision to initially stick with the Ollama LLM backend rather than go down the whole C++ download-and-compile route for MLC LLM. Updated the UI to include the model download for the Llamatik backend to use. Tested as working; models download successfully.
|
2026-02-28 12:58:04 +11:00 |
|
|
|
25167ff6cb
|
Large changes to the UI (and matching backend) to allow chat threads, system-prompt changes, and the start of the MLC LLM integration options for the backend AI switch/choice.
|
2026-02-28 10:30:54 +11:00 |
|
|
|
7c1bc79fb2
|
Now have the message-box-to-langchain4j send routine set up and confirmed, even without an LLM to test against, so the onscreen workflow is proven OK.
|
2026-02-26 10:33:33 +11:00 |
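The commit above proves the message-box round trip without a live LLM. A minimal Kotlin sketch of that idea, using a hypothetical `EchoStubBackend` in place of the real langchain4j model (all names here are illustrative, not taken from the repo):

```kotlin
// Hypothetical stand-in for the real langchain4j chat model: lets the
// message-box -> backend -> reply loop be exercised with no LLM running.
interface ChatBackend {
    fun send(userMessage: String): String
}

// Stub backend that just echoes, so the UI workflow can be verified end to end.
class EchoStubBackend : ChatBackend {
    override fun send(userMessage: String): String =
        "stub-reply: $userMessage"
}

fun main() {
    val backend: ChatBackend = EchoStubBackend()
    // Simulates the message box handing text to the send routine.
    val reply = backend.send("hello from the message box")
    println(reply)
}
```

Once a real backend is available, only the `ChatBackend` implementation needs swapping; the onscreen send path stays unchanged.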
|
|
|
3f281d0b5b
|
Basic Gradle setup and integration.
|
2026-02-25 11:31:46 +11:00 |
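A basic Gradle setup for this kind of project might look like the sketch below (a `build.gradle.kts` fragment; the version numbers are placeholders, not taken from the repo, so check current releases before using them):

```kotlin
// build.gradle.kts -- minimal sketch, versions are placeholders
plugins {
    kotlin("jvm") version "1.9.0"
    application
}

repositories {
    mavenCentral()
}

dependencies {
    // langchain4j core chat abstractions plus the Ollama backend module
    implementation("dev.langchain4j:langchain4j:0.35.0")
    implementation("dev.langchain4j:langchain4j-ollama:0.35.0")
}

application {
    mainClass.set("MainKt")
}
```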
|
|
|
3f9ef6d69b
|
Initial empty project commit.
|
2026-02-25 11:21:42 +11:00 |
|