Compare commits

129 Commits (author and date columns not captured)

| SHA1 |
|---|
| f4f91d9aa1 |
| ad5f73e985 |
| d06ac2c2eb |
| 343411cd8c |
| e14750db44 |
| 3be3b21f93 |
| 5c28245ff3 |
| b433091e90 |
| fe22f34a3d |
| ef395a27a3 |
| 362e62ee36 |
| 80aabcb805 |
| c283f3f1d2 |
| 603fdb8150 |
| 6e0ae393a4 |
| abbdbf1abe |
| a68973ece6 |
| 50aad8cb6d |
| 79a7840f24 |
| 19a8aade60 |
| a591270d58 |
| 1087d3e336 |
| fb7393fe5c |
| d3159ae6ea |
| c2c047d8b7 |
| e897d6c931 |
| 3ceb25ccc6 |
| 9d740a7db9 |
| 4e77898487 |
| c913a25679 |
| 758e055f1c |
| a08be7351c |
| 08c0c54b98 |
| 687b99f9ab |
| 5801d43af9 |
| 5a28a16119 |
| 47e58d2ccd |
| d33e9c53d8 |
| 8869296aed |
| da4dd3341a |
| 15aa7fb844 |
| d74f535968 |
| 521a95fc6b |
| 9548e2ec40 |
| f1a3c7136f |
| 4542f26bb7 |
| 4926cb157e |
| 0d3b544a73 |
| daf56c2de4 |
| 46e3921585 |
| 707984e20d |
| cfb79a70be |
| c32d0acfd4 |
| 38de4ac18e |
| 923c0e52e2 |
| e753591b45 |
| 12e754e4bc |
| eb3919ad63 |
| 1b94864422 |
| 51a90e0b79 |
| 7264902199 |
| f08b03308e |
| f831466d87 |
| a54ec6fa9d |
| ed3136cffd |
| e37c5acbf9 |
| 9d2ad2eb3a |
| d1a0d6375b |
| ebf3af38c8 |
| 80b433e7a4 |
| 5da5c2c702 |
| fe8626f650 |
| ee998d978f |
| cee360d5a2 |
| 5098babfd2 |
| 7026655116 |
| 01ba38a23b |
| ef2c2650b0 |
| a7d955a5bf |
| 129677d27c |
| 9ae011a31b |
| abf1253980 |
| e0c7e9c771 |
| 6a4c98ef18 |
| 809e23fb9c |
| 69eaa56240 |
| df2a9d7b26 |
| da97a7c6ee |
| bb7a8b659a |
| 0999a64356 |
| 4647e1ba47 |
| ea0caf03d6 |
| 1c9ce2a117 |
| 01b38fa37a |
| 80d1149932 |
| cc1500f007 |
| d0735de129 |
| bbf678cb75 |
| a44a7b5044 |
| 427ecd4499 |
| 0a5bb0b97f |
| 252b76e7eb |
| 5cf2be2b7d |
| 7f9a5eb516 |
| 9ecfbe8c3f |
| 377adc8699 |
| 8598f73be7 |
| 6f9b3b7c02 |
| db198e10c0 |
| d2271a7ade |
| 8d3d650ecf |
| 4545f5a1b2 |
| 62c1354f8b |
| 2bd99860f2 |
| 8026550f7a |
| 68c03176d4 |
| ed54b2846a |
| ff927d6c77 |
| bd006da0c1 |
| a409800279 |
| 5d89ccc729 |
| fef3926ce3 |
| c95be9611d |
| 1c4fc4341e |
| 3ddc172437 |
| 0d65cf1cbc |
| cddcf496b2 |
| 9333b31444 |
| cbd3e90073 |
.github/ISSUE_TEMPLATE/bug_report.md (6 changes, vendored)

````diff
@@ -16,5 +16,7 @@ A clear and concise description of what you expected to happen.
 **Screenshots**
 If applicable, add screenshots to help explain your problem.
 
-**Additional context**
-Add any other context about the problem here.
+**Debugging information**
+```
+Please paste here the debugging information available at 'About Alpaca' > 'Troubleshooting' > 'Debugging Information'
+```
````
README.md (23 changes)

```diff
@@ -35,6 +35,24 @@ Normal conversation | Image recognition | Code highlighting | YouTube transcript
 :------------------:|:-----------------:|:-----------------:|:---------------------:|:----------------:
  |  |  |  | 
 
+## Installation
+
+### Flathub
+
+You can find the latest stable version of the app on [Flathub](https://flathub.org/apps/com.jeffser.Alpaca)
+
+### Flatpak Package
+
+Everytime a new version is published they become available on the [releases page](https://github.com/Jeffser/Alpaca/releases) of the repository
+
+### Building Git Version
+
+Note: This is not recommended since the prerelease versions of the app often present errors and general instability.
+
+1. Clone the project
+2. Open with Gnome Builder
+3. Press the run button (or export if you want to build a Flatpak package)
+
 ## Translators
 
 Language | Contributors
@@ -47,6 +65,11 @@ Language | Contributors
 🇮🇳 Bengali | [Aritra Saha](https://github.com/olumolu)
 🇨🇳 Simplified Chinese | [Yuehao Sui](https://github.com/8ar10der) , [Aleksana](https://github.com/Aleksanaa)
 🇮🇳 Hindi | [Aritra Saha](https://github.com/olumolu)
+🇹🇷 Turkish | [YusaBecerikli](https://github.com/YusaBecerikli)
+🇺🇦 Ukrainian | [Simon](https://github.com/OriginalSimon)
+🇩🇪 German | [Marcel Margenberg](https://github.com/MehrzweckMandala)
+
+Want to add a language? Visit [this discussion](https://github.com/Jeffser/Alpaca/discussions/153) to get started!
 
 ---
 
```
Flatpak manifest (filename not shown):

```diff
@@ -9,8 +9,20 @@
         "--share=ipc",
         "--socket=fallback-x11",
         "--device=dri",
-        "--socket=wayland"
+        "--socket=wayland",
+        "--filesystem=/sys/module/amdgpu:ro",
+        "--env=LD_LIBRARY_PATH=/app/lib:/usr/lib/x86_64-linux-gnu/GL/default/lib:/usr/lib/x86_64-linux-gnu/openh264/extra:/usr/lib/x86_64-linux-gnu/openh264/extra:/usr/lib/sdk/llvm15/lib:/usr/lib/x86_64-linux-gnu/GL/default/lib:/usr/lib/ollama:/app/plugins/AMD/lib/ollama"
     ],
+    "add-extensions": {
+        "com.jeffser.Alpaca.Plugins": {
+            "version": "1.0",
+            "add-ld-path": "/app/plugins/AMD/lib/ollama",
+            "directory": "plugins",
+            "no-autodownload": true,
+            "autodelete": true,
+            "subdirectories": true
+        }
+    },
     "cleanup" : [
         "/include",
         "/lib/pkgconfig",
@@ -117,21 +129,22 @@
             "name": "ollama",
             "buildsystem": "simple",
             "build-commands": [
-                "install -Dm0755 ollama* ${FLATPAK_DEST}/bin/ollama"
+                "cp -r --remove-destination * ${FLATPAK_DEST}/",
+                "mkdir ${FLATPAK_DEST}/plugins"
             ],
             "sources": [
                 {
-                    "type": "file",
-                    "url": "https://github.com/ollama/ollama/releases/download/v0.3.3/ollama-linux-amd64",
-                    "sha256": "2b2a4ee4c86fa5b09503e95616bd1b3ee95238b1b3bf12488b9c27c66b84061a",
+                    "type": "archive",
+                    "url": "https://github.com/ollama/ollama/releases/download/v0.3.9/ollama-linux-amd64.tgz",
+                    "sha256": "b0062fbccd46134818d9d59cfa3867ad6849163653cb1171bc852c5f379b0851",
                     "only-arches": [
                         "x86_64"
                     ]
                 },
                 {
-                    "type": "file",
-                    "url": "https://github.com/ollama/ollama/releases/download/v0.3.3/ollama-linux-arm64",
-                    "sha256": "28fddbea0c161bc539fd08a3dc78d51413cfe8da97386cb39420f4f30667e22c",
+                    "type": "archive",
+                    "url": "https://github.com/ollama/ollama/releases/download/v0.3.9/ollama-linux-arm64.tgz",
+                    "sha256": "8979484bcb1448ab9b45107fbcb3b9f43c2af46f961487449b9ebf3518cd70eb",
                     "only-arches": [
                         "aarch64"
                     ]
```
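The ollama sources above are pinned by sha256. A minimal sketch (not part of the repo) of checking a downloaded archive against the manifest digest before building; the digest below is copied from the x86_64 entry, and any local path you pass is hypothetical:

```python
import hashlib

# Expected digest copied from the manifest's x86_64 "sources" entry above.
EXPECTED_SHA256 = "b0062fbccd46134818d9d59cfa3867ad6849163653cb1171bc852c5f379b0851"

def sha256_of(path: str) -> str:
    """Stream the file in 1 MiB chunks so large archives need not fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def matches_manifest(path: str, expected: str = EXPECTED_SHA256) -> bool:
    """True when the local archive matches the digest pinned in the manifest."""
    return sha256_of(path) == expected
```

This is equivalent to running `sha256sum` on the downloaded tarball and comparing by eye; flatpak-builder performs the same verification itself when it fetches the source.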
Desktop entry:

```diff
@@ -5,5 +5,6 @@ Icon=com.jeffser.Alpaca
 Terminal=false
 Type=Application
 Categories=Utility;Development;Chat;
+Keywords=ai;ollama;llm
 StartupNotify=true
 X-Purism-FormFactor=Workstation;Mobile;
```
AppStream metainfo:

```diff
@@ -78,6 +78,32 @@
     <url type="contribute">https://github.com/Jeffser/Alpaca/discussions/154</url>
     <url type="vcs-browser">https://github.com/Jeffser/Alpaca</url>
     <releases>
+      <release version="2.0.0" date="2024-09-01">
+        <url type="details">https://github.com/Jeffser/Alpaca/releases/tag/2.0.0</url>
+        <description>
+          <p>New</p>
+          <ul>
+            <li>Model, message and chat systems have been rewritten</li>
+            <li>New models are available</li>
+            <li>Ollama updated to v0.3.9</li>
+            <li>Added support for multiple chat generations simultaneously</li>
+            <li>Added experimental AMD GPU support</li>
+            <li>Added message loading spinner and new message indicator to chat tab</li>
+            <li>Added animations</li>
+            <li>Changed model manager / model selector appearance</li>
+            <li>Changed message appearance</li>
+            <li>Added markdown and code blocks to user messages</li>
+            <li>Added loading dialog at launch so the app opens faster</li>
+            <li>Added warning when device is on 'battery saver' mode</li>
+            <li>Added inactivity timer to integrated instance</li>
+          </ul>
+          <ul>
+            <li>The chat is now scrolled to the bottom when it's changed</li>
+            <li>Better handling of focus on messages</li>
+            <li>Better general performance on the app</li>
+          </ul>
+        </description>
+      </release>
       <release version="1.1.1" date="2024-08-12">
         <url type="details">https://github.com/Jeffser/Alpaca/releases/tag/1.1.1</url>
         <description>
```
Meson build file:

```diff
@@ -1,5 +1,5 @@
 project('Alpaca', 'c',
-        version: '1.1.1',
+        version: '2.0.0',
         meson_version: '>= 0.62.0',
         default_options: [ 'warning_level=2', 'werror=false', ],
 )
```
Gettext language list (LINGUAS):

```diff
@@ -4,5 +4,8 @@ pt_BR
 fr
 nb_NO
 bn
-zh_CN
+zh_Hans
 hi
+tr
+uk
+de
```
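The list above follows the gettext LINGUAS convention: one locale code per line, with `#` starting a comment. A small illustrative parser (an assumption about the format, not code from the repo):

```python
def parse_linguas(text: str) -> list[str]:
    """Parse a gettext LINGUAS file: locale codes separated by
    whitespace or newlines; blank lines and '#' comments are ignored."""
    codes = []
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop trailing comments
        if line:
            codes.extend(line.split())  # several codes may share one line
    return codes

# The post-change language list from the diff above.
sample = """\
# languages shipped with the app
pt_BR
fr
nb_NO
bn
zh_Hans
hi
tr
uk
de
"""
```

With this input, `parse_linguas(sample)` yields the nine locale codes in file order, which is how build systems typically decide which `po/*.po` files to compile.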
Gettext source file list (POTFILES):

```diff
@@ -7,3 +7,7 @@ src/available_models_descriptions.py
 src/connection_handler.py
 src/dialogs.py
 src/window.ui
+src/custom_widgets/chat_widget.py
+src/custom_widgets/message_widget.py
+src/custom_widgets/model_widget.py
+src/custom_widgets/table_widget.py
```
po/alpaca.pot (1647 changes): file diff suppressed because it is too large
po/az.po (6 changes)

```diff
@@ -1,6 +1,6 @@
-# Azerbaijani translations for PACKAGE package.
-# Copyright (C) 2024 THE PACKAGE'S COPYRIGHT HOLDER
-# This file is distributed under the same license as the PACKAGE package.
+# Azerbaijani translations for Alpaca package.
+# Copyright (C) 2024 Jeffry Samuel Eduarte Rojas
+# This file is distributed under the same license as the Alpaca package.
 # YOUR NAME <YOUR EMAIL (optional)>, 2024.
 #
 msgid ""
```
po/nb_NO.po (1826 changes): file diff suppressed because it is too large

po/pt_BR.po (1675 changes): file diff suppressed because it is too large

(unnamed file): file diff suppressed because it is too large
GResource bundle:

```diff
@@ -30,6 +30,7 @@
     <file alias="icons/scalable/status/image-missing-symbolic.svg">icons/image-missing-symbolic.svg</file>
     <file alias="icons/scalable/status/update-symbolic.svg">icons/update-symbolic.svg</file>
     <file alias="icons/scalable/status/down-symbolic.svg">icons/down-symbolic.svg</file>
+    <file alias="icons/scalable/status/chat-bubble-text-symbolic.svg">icons/chat-bubble-text-symbolic.svg</file>
     <file preprocess="xml-stripblanks">window.ui</file>
     <file preprocess="xml-stripblanks">gtk/help-overlay.ui</file>
   </gresource>
```
(unnamed file): file diff suppressed because it is too large
Available model descriptions:

```diff
@@ -1,6 +1,6 @@
 descriptions = {
     'llama3.1': _("Llama 3.1 is a new state-of-the-art model from Meta available in 8B, 70B and 405B parameter sizes."),
-    'gemma2': _("Google Gemma 2 is now available in 2 sizes, 9B and 27B."),
+    'gemma2': _("Google Gemma 2 is a high-performing and efficient model by now available in three sizes: 2B, 9B, and 27B."),
     'mistral-nemo': _("A state-of-the-art 12B model with 128k context length, built by Mistral AI in collaboration with NVIDIA."),
     'mistral-large': _("Mistral Large 2 is Mistral's new flagship model that is significantly more capable in code generation, mathematics, and reasoning with 128k context window and support for dozens of languages."),
     'qwen2': _("Qwen2 is a new series of large language models from Alibaba group"),
@@ -17,89 +17,96 @@ descriptions = {
     'qwen': _("Qwen 1.5 is a series of large language models by Alibaba Cloud spanning from 0.5B to 110B parameters"),
     'llama2': _("Llama 2 is a collection of foundation language models ranging from 7B to 70B parameters."),
     'codellama': _("A large language model that can use text prompts to generate and discuss code."),
-    'dolphin-mixtral': _("Uncensored, 8x7b and 8x22b fine-tuned models based on the Mixtral mixture of experts models that excels at coding tasks. Created by Eric Hartford."),
     'nomic-embed-text': _("A high-performing open embedding model with a large token context window."),
-    'llama2-uncensored': _("Uncensored Llama 2 model by George Sung and Jarrad Hope."),
+    'dolphin-mixtral': _("Uncensored, 8x7b and 8x22b fine-tuned models based on the Mixtral mixture of experts models that excels at coding tasks. Created by Eric Hartford."),
     'phi': _("Phi-2: a 2.7B language model by Microsoft Research that demonstrates outstanding reasoning and language understanding capabilities."),
+    'llama2-uncensored': _("Uncensored Llama 2 model by George Sung and Jarrad Hope."),
     'deepseek-coder': _("DeepSeek Coder is a capable coding model trained on two trillion code and natural language tokens."),
+    'mxbai-embed-large': _("State-of-the-art large embedding model from mixedbread.ai"),
+    'zephyr': _("Zephyr is a series of fine-tuned versions of the Mistral and Mixtral models that are trained to act as helpful assistants."),
     'dolphin-mistral': _("The uncensored Dolphin model based on Mistral that excels at coding tasks. Updated to version 2.8."),
+    'starcoder2': _("StarCoder2 is the next generation of transparently trained open code LLMs that comes in three sizes: 3B, 7B and 15B parameters."),
     'orca-mini': _("A general-purpose model ranging from 3 billion parameters to 70 billion, suitable for entry-level hardware."),
     'dolphin-llama3': _("Dolphin 2.9 is a new model with 8B and 70B sizes by Eric Hartford based on Llama 3 that has a variety of instruction, conversational, and coding skills."),
-    'mxbai-embed-large': _("State-of-the-art large embedding model from mixedbread.ai"),
-    'starcoder2': _("StarCoder2 is the next generation of transparently trained open code LLMs that comes in three sizes: 3B, 7B and 15B parameters."),
-    'mistral-openorca': _("Mistral OpenOrca is a 7 billion parameter model, fine-tuned on top of the Mistral 7B model using the OpenOrca dataset."),
     'yi': _("Yi 1.5 is a high-performing, bilingual language model."),
-    'zephyr': _("Zephyr is a series of fine-tuned versions of the Mistral and Mixtral models that are trained to act as helpful assistants."),
+    'mistral-openorca': _("Mistral OpenOrca is a 7 billion parameter model, fine-tuned on top of the Mistral 7B model using the OpenOrca dataset."),
-    'llama2-chinese': _("Llama 2 based model fine tuned to improve Chinese dialogue ability."),
     'llava-llama3': _("A LLaVA model fine-tuned from Llama 3 Instruct with better scores in several benchmarks."),
-    'vicuna': _("General use chat model based on Llama and Llama 2 with 2K to 16K context sizes."),
-    'nous-hermes2': _("The powerful family of models by Nous Research that excels at scientific discussion and coding tasks."),
-    'tinyllama': _("The TinyLlama project is an open endeavor to train a compact 1.1B Llama model on 3 trillion tokens."),
-    'wizard-vicuna-uncensored': _("Wizard Vicuna Uncensored is a 7B, 13B, and 30B parameter model based on Llama 2 uncensored by Eric Hartford."),
-    'codestral': _("Codestral is Mistral AI’s first-ever code model designed for code generation tasks."),
     'starcoder': _("StarCoder is a code generation model trained on 80+ programming languages."),
-    'wizardlm2': _("State of the art large language model from Microsoft AI with improved performance on complex chat, multilingual, reasoning and agent use cases."),
+    'llama2-chinese': _("Llama 2 based model fine tuned to improve Chinese dialogue ability."),
+    'vicuna': _("General use chat model based on Llama and Llama 2 with 2K to 16K context sizes."),
+    'tinyllama': _("The TinyLlama project is an open endeavor to train a compact 1.1B Llama model on 3 trillion tokens."),
+    'codestral': _("Codestral is Mistral AI’s first-ever code model designed for code generation tasks."),
+    'wizard-vicuna-uncensored': _("Wizard Vicuna Uncensored is a 7B, 13B, and 30B parameter model based on Llama 2 uncensored by Eric Hartford."),
+    'nous-hermes2': _("The powerful family of models by Nous Research that excels at scientific discussion and coding tasks."),
     'openchat': _("A family of open-source models trained on a wide variety of data, surpassing ChatGPT on various benchmarks. Updated to version 3.5-0106."),
     'aya': _("Aya 23, released by Cohere, is a new family of state-of-the-art, multilingual models that support 23 languages."),
+    'wizardlm2': _("State of the art large language model from Microsoft AI with improved performance on complex chat, multilingual, reasoning and agent use cases."),
     'tinydolphin': _("An experimental 1.1B parameter model trained on the new Dolphin 2.8 dataset by Eric Hartford and based on TinyLlama."),
-    'openhermes': _("OpenHermes 2.5 is a 7B model fine-tuned by Teknium on Mistral with fully open datasets."),
+    'granite-code': _("A family of open foundation models by IBM for Code Intelligence"),
     'wizardcoder': _("State-of-the-art code generation model"),
     'stable-code': _("Stable Code 3B is a coding model with instruct and code completion variants on par with models such as Code Llama 7B that are 2.5x larger."),
+    'openhermes': _("OpenHermes 2.5 is a 7B model fine-tuned by Teknium on Mistral with fully open datasets."),
+    'all-minilm': _("Embedding models on very large sentence level datasets."),
     'codeqwen': _("CodeQwen1.5 is a large language model pretrained on a large amount of code data."),
+    'stablelm2': _("Stable LM 2 is a state-of-the-art 1.6B and 12B parameter language model trained on multilingual data in English, Spanish, German, Italian, French, Portuguese, and Dutch."),
     'wizard-math': _("Model focused on math and logic problems"),
     'neural-chat': _("A fine-tuned model based on Mistral with good coverage of domain and language."),
-    'stablelm2': _("Stable LM 2 is a state-of-the-art 1.6B and 12B parameter language model trained on multilingual data in English, Spanish, German, Italian, French, Portuguese, and Dutch."),
-    'granite-code': _("A family of open foundation models by IBM for Code Intelligence"),
-    'all-minilm': _("Embedding models on very large sentence level datasets."),
-    'phind-codellama': _("Code generation model based on Code Llama."),
-    'dolphincoder': _("A 7B and 15B uncensored variant of the Dolphin model family that excels at coding, based on StarCoder2."),
-    'nous-hermes': _("General use models based on Llama and Llama 2 from Nous Research."),
-    'sqlcoder': _("SQLCoder is a code completion model fined-tuned on StarCoder for SQL generation tasks"),
     'llama3-gradient': _("This model extends LLama-3 8B's context length from 8k to over 1m tokens."),
-    'starling-lm': _("Starling is a large language model trained by reinforcement learning from AI feedback focused on improving chatbot helpfulness."),
+    'phind-codellama': _("Code generation model based on Code Llama."),
-    'yarn-llama2': _("An extension of Llama 2 that supports a context of up to 128k tokens."),
+    'nous-hermes': _("General use models based on Llama and Llama 2 from Nous Research."),
+    'dolphincoder': _("A 7B and 15B uncensored variant of the Dolphin model family that excels at coding, based on StarCoder2."),
+    'sqlcoder': _("SQLCoder is a code completion model fined-tuned on StarCoder for SQL generation tasks"),
     'xwinlm': _("Conversational model based on Llama 2 that performs competitively on various benchmarks."),
     'deepseek-llm': _("An advanced language model crafted with 2 trillion bilingual tokens."),
+    'yarn-llama2': _("An extension of Llama 2 that supports a context of up to 128k tokens."),
     'llama3-chatqa': _("A model from NVIDIA based on Llama 3 that excels at conversational question answering (QA) and retrieval-augmented generation (RAG)."),
-    'orca2': _("Orca 2 is built by Microsoft research, and are a fine-tuned version of Meta's Llama 2 models. The model is designed to excel particularly in reasoning."),
     'wizardlm': _("General use model based on Llama 2."),
+    'starling-lm': _("Starling is a large language model trained by reinforcement learning from AI feedback focused on improving chatbot helpfulness."),
+    'codegeex4': _("A versatile model for AI software development scenarios, including code completion."),
+    'snowflake-arctic-embed': _("A suite of text embedding models by Snowflake, optimized for performance."),
+    'orca2': _("Orca 2 is built by Microsoft research, and are a fine-tuned version of Meta's Llama 2 models. The model is designed to excel particularly in reasoning."),
     'solar': _("A compact, yet powerful 10.7B large language model designed for single-turn conversation."),
     'samantha-mistral': _("A companion assistant trained in philosophy, psychology, and personal relationships. Based on Mistral."),
-    'dolphin-phi': _("2.7B uncensored Dolphin model by Eric Hartford, based on the Phi language model by Microsoft Research."),
-    'stable-beluga': _("Llama 2 based model fine tuned on an Orca-style dataset. Originally called Free Willy."),
     'moondream': _("moondream2 is a small vision language model designed to run efficiently on edge devices."),
-    'bakllava': _("BakLLaVA is a multimodal model consisting of the Mistral 7B base model augmented with the LLaVA architecture."),
+    'smollm': _("🪐 A family of small models with 135M, 360M, and 1.7B parameters, trained on a new high-quality dataset."),
-    'wizardlm-uncensored': _("Uncensored version of Wizard LM model"),
+    'stable-beluga': _("🪐 A family of small models with 135M, 360M, and 1.7B parameters, trained on a new high-quality dataset."),
-    'snowflake-arctic-embed': _("A suite of text embedding models by Snowflake, optimized for performance."),
+    'qwen2-math': _("Qwen2 Math is a series of specialized math language models built upon the Qwen2 LLMs, which significantly outperforms the mathematical capabilities of open-source models and even closed-source models (e.g., GPT4o)."),
+    'dolphin-phi': _("2.7B uncensored Dolphin model by Eric Hartford, based on the Phi language model by Microsoft Research."),
     'deepseek-v2': _("A strong, economical, and efficient Mixture-of-Experts language model."),
-    'medllama2': _("Fine-tuned Llama 2 model to answer medical questions based on an open source medical dataset."),
+    'bakllava': _("BakLLaVA is a multimodal model consisting of the Mistral 7B base model augmented with the LLaVA architecture."),
-    'yarn-mistral': _("An extension of Mistral to support context windows of 64K or 128K."),
-    'llama-pro': _("An expansion of Llama 2 that specializes in integrating both general language understanding and domain-specific knowledge, particularly in programming and mathematics."),
-    'nous-hermes2-mixtral': _("The Nous Hermes 2 model from Nous Research, now trained over Mixtral."),
-    'meditron': _("Open-source medical large language model adapted from Llama 2 to the medical domain."),
-    'codeup': _("Great code generation model based on Llama2."),
-    'nexusraven': _("Nexus Raven is a 13B instruction tuned model for function calling tasks."),
-    'everythinglm': _("Uncensored Llama2 based model with support for a 16K context window."),
-    'llava-phi3': _("A new small LLaVA model fine-tuned from Phi 3 Mini."),
-    'codegeex4': _("A versatile model for AI software development scenarios, including code completion."),
     'glm4': _("A strong multi-lingual general language model with competitive performance to Llama 3."),
+    'wizardlm-uncensored': _("Uncensored version of Wizard LM model"),
+    'yarn-mistral': _("An extension of Mistral to support context windows of 64K or 128K."),
+    'phi3.5': _("A lightweight AI model with 3.8 billion parameters with performance overtaking similarly and larger sized models."),
+    'medllama2': _("Fine-tuned Llama 2 model to answer medical questions based on an open source medical dataset."),
+    'llama-pro': _("An expansion of Llama 2 that specializes in integrating both general language understanding and domain-specific knowledge, particularly in programming and mathematics."),
+    'llava-phi3': _("A new small LLaVA model fine-tuned from Phi 3 Mini."),
+    'meditron': _("Open-source medical large language model adapted from Llama 2 to the medical domain."),
+    'nous-hermes2-mixtral': _("The Nous Hermes 2 model from Nous Research, now trained over Mixtral."),
+    'nexusraven': _("Nexus Raven is a 13B instruction tuned model for function calling tasks."),
+    'codeup': _("Great code generation model based on Llama2."),
+    'everythinglm': _("Uncensored Llama2 based model with support for a 16K context window."),
+    'hermes3': _("Hermes 3 is the latest version of the flagship Hermes series of LLMs by Nous Research"),
+    'internlm2': _("InternLM2.5 is a 7B parameter model tailored for practical scenarios with outstanding reasoning capability."),
     'magicoder': _("🎩 Magicoder is a family of 7B parameter models trained on 75K synthetic instruction data using OSS-Instruct, a novel approach to enlightening LLMs with open-source code snippets."),
     'stablelm-zephyr': _("A lightweight chat model allowing accurate, and responsive output without requiring high-end hardware."),
     'codebooga': _("A high-performing code instruct model created by merging two existing code models."),
     'mistrallite': _("MistralLite is a fine-tuned model based on Mistral with enhanced capabilities of processing long contexts."),
+    'llama3-groq-tool-use': _("A series of models from Groq that represent a significant advancement in open-source AI capabilities for tool use/function calling."),
+    'falcon2': _("Falcon2 is an 11B parameters causal decoder-only model built by TII and trained over 5T tokens."),
     'wizard-vicuna': _("Wizard Vicuna is a 13B parameter model based on Llama 2 trained by MelodysDreamj."),
     'duckdb-nsql': _("7B parameter text-to-SQL model made by MotherDuck and Numbers Station."),
     'megadolphin': _("MegaDolphin-2.2-120b is a transformation of Dolphin-2.2-70b created by interleaving the model with itself."),
```
|
'megadolphin': _("MegaDolphin-2.2-120b is a transformation of Dolphin-2.2-70b created by interleaving the model with itself."),
|
||||||
'goliath': _("A language model created by combining two fine-tuned Llama 2 70B models into one."),
|
|
||||||
'notux': _("A top-performing mixture of experts model, fine-tuned with high-quality data."),
|
'notux': _("A top-performing mixture of experts model, fine-tuned with high-quality data."),
|
||||||
|
'goliath': _("A language model created by combining two fine-tuned Llama 2 70B models into one."),
|
||||||
'open-orca-platypus2': _("Merge of the Open Orca OpenChat model and the Garage-bAInd Platypus 2 model. Designed for chat and code generation."),
|
'open-orca-platypus2': _("Merge of the Open Orca OpenChat model and the Garage-bAInd Platypus 2 model. Designed for chat and code generation."),
|
||||||
'falcon2': _("Falcon2 is an 11B parameters causal decoder-only model built by TII and trained over 5T tokens."),
|
|
||||||
'notus': _("A 7B chat model fine-tuned with high-quality data and based on Zephyr."),
|
'notus': _("A 7B chat model fine-tuned with high-quality data and based on Zephyr."),
|
||||||
'dbrx': _("DBRX is an open, general-purpose LLM created by Databricks."),
|
'dbrx': _("DBRX is an open, general-purpose LLM created by Databricks."),
|
||||||
'internlm2': _("InternLM2.5 is a 7B parameter model tailored for practical scenarios with outstanding reasoning capability."),
|
|
||||||
'alfred': _("A robust conversational model designed to be used for both chat and instruct use cases."),
|
|
||||||
'llama3-groq-tool-use': _("A series of models from Groq that represent a significant advancement in open-source AI capabilities for tool use/function calling."),
|
|
||||||
'mathstral': _("MathΣtral: a 7B model designed for math reasoning and scientific discovery by Mistral AI."),
|
'mathstral': _("MathΣtral: a 7B model designed for math reasoning and scientific discovery by Mistral AI."),
|
||||||
|
'bge-m3': _("BGE-M3 is a new model from BAAI distinguished for its versatility in Multi-Functionality, Multi-Linguality, and Multi-Granularity."),
|
||||||
|
'alfred': _("A robust conversational model designed to be used for both chat and instruct use cases."),
|
||||||
'firefunction-v2': _("An open weights function calling model based on Llama 3, competitive with GPT-4o function calling capabilities."),
|
'firefunction-v2': _("An open weights function calling model based on Llama 3, competitive with GPT-4o function calling capabilities."),
|
||||||
'nuextract': _("A 3.8B model fine-tuned on a private high-quality synthetic dataset for information extraction, based on Phi-3."),
|
'nuextract': _("A 3.8B model fine-tuned on a private high-quality synthetic dataset for information extraction, based on Phi-3."),
|
||||||
|
'bge-large': _("Embedding model from BAAI mapping texts to vectors."),
|
||||||
|
'paraphrase-multilingual': _("Sentence-transformers model that can be used for tasks like clustering or semantic search."),
|
||||||
}
|
}
|
||||||
@@ -2,33 +2,136 @@
-"""
-Handles requests to remote and integrated instances of Ollama
-"""
-import json
-import requests
-#OK=200 response.status_code
-URL = None
-BEARER_TOKEN = None
-
-def get_headers(include_json:bool) -> dict:
-    headers = {}
-    if include_json:
-        headers["Content-Type"] = "application/json"
-    if BEARER_TOKEN:
-        headers["Authorization"] = "Bearer {}".format(BEARER_TOKEN)
-    return headers if len(headers.keys()) > 0 else None
-
-def simple_get(connection_url:str) -> dict:
-    return requests.get(connection_url, headers=get_headers(False))
-
-def simple_post(connection_url:str, data) -> dict:
-    return requests.post(connection_url, headers=get_headers(True), data=data, stream=False)
-
-def simple_delete(connection_url:str, data) -> dict:
-    return requests.delete(connection_url, headers=get_headers(False), json=data)
-
-def stream_post(connection_url:str, data, callback:callable) -> dict:
-    response = requests.post(connection_url, headers=get_headers(True), data=data, stream=True)
-    if response.status_code == 200:
-        for line in response.iter_lines():
-            if line:
-                callback(json.loads(line.decode("utf-8")))
-    return response
+"""
+Handles requests to remote and integrated instances of Ollama
+"""
+import json, os, requests, subprocess, threading, shutil
+from .internal import data_dir, cache_dir
+from logging import getLogger
+from time import sleep
+
+logger = getLogger(__name__)
+
+window = None
+
+def log_output(pipe):
+    with open(os.path.join(data_dir, 'tmp.log'), 'a') as f:
+        with pipe:
+            try:
+                for line in iter(pipe.readline, ''):
+                    print(line, end='')
+                    f.write(line)
+                    f.flush()
+            except:
+                pass
+
+class instance():
+
+    def __init__(self, local_port:int, remote_url:str, remote:bool, tweaks:dict, overrides:dict, bearer_token:str, idle_timer_delay:int):
+        self.local_port=local_port
+        self.remote_url=remote_url
+        self.remote=remote
+        self.tweaks=tweaks
+        self.overrides=overrides
+        self.bearer_token=bearer_token
+        self.idle_timer_delay=idle_timer_delay
+        self.idle_timer_stop_event=threading.Event()
+        self.idle_timer=None
+        self.instance=None
+        self.busy=0
+        if not self.remote:
+            self.start()
+
+    def get_headers(self, include_json:bool) -> dict:
+        headers = {}
+        if include_json:
+            headers["Content-Type"] = "application/json"
+        if self.bearer_token and self.remote:
+            headers["Authorization"] = "Bearer " + self.bearer_token
+        return headers if len(headers.keys()) > 0 else None
+
+    def request(self, connection_type:str, connection_url:str, data:dict=None, callback:callable=None) -> requests.models.Response:
+        self.busy += 1
+        if self.idle_timer and not self.remote:
+            self.idle_timer_stop_event.set()
+            self.idle_timer=None
+        if not self.instance and not self.remote:
+            self.start()
+        connection_url = '{}/{}'.format(self.remote_url if self.remote else 'http://127.0.0.1:{}'.format(self.local_port), connection_url)
+        logger.info('{} : {}'.format(connection_type, connection_url))
+        response = None
+        match connection_type:
+            case "GET":
+                response = requests.get(connection_url, headers=self.get_headers(False))
+            case "POST":
+                if callback:
+                    response = requests.post(connection_url, headers=self.get_headers(True), data=data, stream=True)
+                    if response.status_code == 200:
+                        for line in response.iter_lines():
+                            if line:
+                                callback(json.loads(line.decode("utf-8")))
+                else:
+                    response = requests.post(connection_url, headers=self.get_headers(True), data=data, stream=False)
+            case "DELETE":
+                response = requests.delete(connection_url, headers=self.get_headers(False), data=data)
+        self.busy -= 1
+        if not self.idle_timer and not self.remote:
+            self.start_timer()
+        return response
+
+    def run_timer(self):
+        if not self.idle_timer_stop_event.wait(self.idle_timer_delay*60):
+            window.show_toast(_("Ollama instance was shut down due to inactivity"), window.main_overlay)
+            self.stop()
+
+    def start_timer(self):
+        if self.busy == 0:
+            if self.idle_timer:
+                self.idle_timer_stop_event.set()
+                self.idle_timer=None
+            if self.idle_timer_delay > 0 and self.busy == 0:
+                self.idle_timer_stop_event.clear()
+                self.idle_timer = threading.Thread(target=self.run_timer)
+                self.idle_timer.start()
+
+    def start(self):
+        if shutil.which('ollama'):
+            if not os.path.isdir(os.path.join(cache_dir, 'tmp/ollama')):
+                os.mkdir(os.path.join(cache_dir, 'tmp/ollama'))
+            self.instance = None
+            params = self.overrides.copy()
+            params["OLLAMA_DEBUG"] = "1"
+            params["OLLAMA_HOST"] = f"127.0.0.1:{self.local_port}" # You can't change this directly sorry :3
+            params["HOME"] = data_dir
+            params["TMPDIR"] = os.path.join(cache_dir, 'tmp/ollama')
+            instance = subprocess.Popen(["ollama", "serve"], env={**os.environ, **params}, stderr=subprocess.PIPE, stdout=subprocess.PIPE, text=True)
+            threading.Thread(target=log_output, args=(instance.stdout,)).start()
+            threading.Thread(target=log_output, args=(instance.stderr,)).start()
+            logger.info("Starting Alpaca's Ollama instance...")
+            logger.debug(params)
+            logger.info("Started Alpaca's Ollama instance")
+            v_str = subprocess.check_output("ollama -v", shell=True).decode('utf-8')
+            logger.info('Ollama version: {}'.format(v_str.split('client version is ')[1].strip()))
+            self.instance = instance
+            if not self.idle_timer:
+                self.start_timer()
+        else:
+            self.remote = True
+            if not self.remote_url:
+                window.remote_connection_entry.set_text('http://0.0.0.0:11434')
+            window.remote_connection_switch.set_sensitive(True)
+            window.remote_connection_switch.set_active(True)
+
+    def stop(self):
+        if self.idle_timer:
+            self.idle_timer_stop_event.set()
+            self.idle_timer=None
+        if self.instance:
+            logger.info("Stopping Alpaca's Ollama instance")
+            self.instance.terminate()
+            self.instance.wait()
+            self.instance = None
+            logger.info("Stopped Alpaca's Ollama instance")
+
+    def reset(self):
+        logger.info("Resetting Alpaca's Ollama instance")
+        self.stop()
+        sleep(1)
+        self.start()
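The `instance` class above implements idle shutdown by having a timer thread wait on a `threading.Event` with a timeout: if `request` activity sets the event first, the shutdown never fires. A minimal, self-contained sketch of that pattern (the `IdleGuard` name and `on_idle` callback are illustrative, not from the diff, and the delay is in seconds rather than minutes):

```python
import threading

class IdleGuard:
    """Sketch of the idle-timer pattern used by `instance`: a worker
    waits on an Event with a timeout; if the timeout elapses without
    the event being set, the shutdown callback fires."""
    def __init__(self, delay_seconds, on_idle):
        self.delay_seconds = delay_seconds
        self.on_idle = on_idle
        self.stop_event = threading.Event()
        self.timer = None

    def start(self):
        self.stop_event.clear()
        self.timer = threading.Thread(target=self._run)
        self.timer.start()

    def _run(self):
        # Event.wait returns False only when the timeout expired (idle)
        if not self.stop_event.wait(self.delay_seconds):
            self.on_idle()

    def cancel(self):
        # Activity arrived: wake the waiter so on_idle never runs
        self.stop_event.set()
        if self.timer:
            self.timer.join()
            self.timer = None

fired = []
guard = IdleGuard(0.05, lambda: fired.append(True))
guard.start()
guard.timer.join()   # let the timeout elapse: callback fires
print(fired)         # [True]

guard2 = IdleGuard(5, lambda: fired.append(True))
guard2.start()
guard2.cancel()      # cancelled before the timeout: no second fire
print(fired)         # [True]
```

Using an `Event` rather than `time.sleep` is what lets `request` cancel a pending shutdown immediately instead of waiting out the full delay.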
446 src/custom_widgets/chat_widget.py Normal file
@@ -0,0 +1,446 @@
#chat_widget.py
"""
Handles the chat widget (testing)
"""

import gi
gi.require_version('Gtk', '4.0')
gi.require_version('GtkSource', '5')
from gi.repository import Gtk, Gio, Adw, Gdk
import logging, os, datetime, shutil, random, tempfile, tarfile, json
from ..internal import data_dir
from .message_widget import message

logger = logging.getLogger(__name__)

window = None

possible_prompts = [
    "What can you do?",
    "Give me a pancake recipe",
    "Why is the sky blue?",
    "Can you tell me a joke?",
    "Give me a healthy breakfast recipe",
    "How to make a pizza",
    "Can you write a poem?",
    "Can you write a story?",
    "What is GNU-Linux?",
    "Which is the best Linux distro?",
    "Why is Pluto not a planet?",
    "What is a black-hole?",
    "Tell me how to stay fit",
    "Write a conversation between sun and Earth",
    "Why is the grass green?",
    "Write an Haïku about AI",
    "What is the meaning of life?",
    "Explain quantum physics in simple terms",
    "Explain the theory of relativity",
    "Explain how photosynthesis works",
    "Recommend a film about nature",
    "What is nostalgia?"
]

class chat(Gtk.ScrolledWindow):
    __gtype_name__ = 'AlpacaChat'

    def __init__(self, name:str):
        self.container = Gtk.Box(
            orientation=1,
            hexpand=True,
            vexpand=True,
            spacing=12,
            margin_top=12,
            margin_bottom=12,
            margin_start=12,
            margin_end=12
        )
        self.clamp = Adw.Clamp(
            maximum_size=1000,
            tightening_threshold=800,
            child=self.container
        )
        super().__init__(
            child=self.clamp,
            propagate_natural_height=True,
            kinetic_scrolling=True,
            vexpand=True,
            hexpand=True,
            css_classes=["undershoot-bottom"],
            name=name
        )
        self.messages = {}
        self.welcome_screen = None
        self.regenerate_button = None
        self.busy = False
        self.get_vadjustment().connect('notify::page-size', lambda va, *_: va.set_value(va.get_upper() - va.get_page_size()) if va.get_value() == 0 else None)

    def stop_message(self):
        self.busy = False
        window.switch_send_stop_button(True)

    def clear_chat(self):
        if self.busy:
            self.stop_message()
        self.message = {}
        self.stop_message()
        for widget in list(self.container):
            self.container.remove(widget)

    def add_message(self, message_id:str, model:str=None):
        msg = message(message_id, model)
        self.messages[message_id] = msg
        self.container.append(msg)

    def send_sample_prompt(self, prompt):
        buffer = window.message_text_view.get_buffer()
        buffer.delete(buffer.get_start_iter(), buffer.get_end_iter())
        buffer.insert(buffer.get_start_iter(), prompt, len(prompt.encode('utf-8')))
        window.send_message()

    def show_welcome_screen(self, show_prompts:bool):
        if self.welcome_screen:
            self.container.remove(self.welcome_screen)
            self.welcome_screen = None
        self.clear_chat()
        button_container = Gtk.Box(
            orientation=1,
            spacing=10,
            halign=3
        )
        if show_prompts:
            for prompt in random.sample(possible_prompts, 3):
                prompt_button = Gtk.Button(
                    label=prompt,
                    tooltip_text=_("Send prompt: '{}'").format(prompt)
                )
                prompt_button.connect('clicked', lambda *_, prompt=prompt : self.send_sample_prompt(prompt))
                button_container.append(prompt_button)
        else:
            button = Gtk.Button(
                label=_("Open Model Manager"),
                tooltip_text=_("Open Model Manager"),
                css_classes=["suggested-action", "pill"]
            )
            button.connect('clicked', lambda *_ : window.manage_models_dialog.present(window))
            button_container.append(button)

        self.welcome_screen = Adw.StatusPage(
            icon_name="com.jeffser.Alpaca",
            title="Alpaca",
            description=_("Try one of these prompts") if show_prompts else _("It looks like you don't have any models downloaded yet. Download models to get started!"),
            child=button_container,
            vexpand=True
        )

        self.container.append(self.welcome_screen)

    def load_chat_messages(self, messages:dict):
        if len(messages.keys()) > 0:
            if self.welcome_screen:
                self.container.remove(self.welcome_screen)
                self.welcome_screen = None
            for message_id, message_data in messages.items():
                if message_data['content']:
                    self.add_message(message_id, message_data['model'] if message_data['role'] == 'assistant' else None)
                    message_element = self.messages[message_id]
                    if 'images' in message_data:
                        images=[]
                        for image in message_data['images']:
                            images.append(os.path.join(data_dir, "chats", self.get_name(), message_id, image))
                        message_element.add_images(images)
                    if 'files' in message_data:
                        files={}
                        for file_name, file_type in message_data['files'].items():
                            files[os.path.join(data_dir, "chats", self.get_name(), message_id, file_name)] = file_type
                        message_element.add_attachments(files)
                    message_element.set_text(message_data['content'])
                    message_element.add_footer(datetime.datetime.strptime(message_data['date'] + (":00" if message_data['date'].count(":") == 1 else ""), '%Y/%m/%d %H:%M:%S'))
        else:
            self.show_welcome_screen(len(window.model_manager.get_model_list()) > 0)

    def messages_to_dict(self) -> dict:
        messages_dict = {}
        for message_id, message_element in self.messages.items():
            if message_element.text and message_element.dt:
                messages_dict[message_id] = {
                    'role': 'assistant' if message_element.bot else 'user',
                    'model': message_element.model,
                    'date': message_element.dt.strftime("%Y/%m/%d %H:%M:%S"),
                    'content': message_element.text
                }

                if message_element.image_c:
                    images = []
                    for file in message_element.image_c.files:
                        images.append(file.image_name)
                    messages_dict[message_id]['images'] = images

                if message_element.attachment_c:
                    files = {}
                    for file in message_element.attachment_c.files:
                        files[file.file_name] = file.file_type
                    messages_dict[message_id]['files'] = files
        return messages_dict

    def show_regenerate_button(self, msg:message):
        if self.regenerate_button:
            self.remove(self.regenerate_button)
        self.regenerate_button = Gtk.Button(
            child=Adw.ButtonContent(
                icon_name='update-symbolic',
                label=_('Regenerate Response')
            ),
            css_classes=["suggested-action"],
            halign=3
        )
        self.regenerate_button.connect('clicked', lambda *_: msg.action_buttons.regenerate_message())
        self.container.append(self.regenerate_button)

class chat_tab(Gtk.ListBoxRow):
    __gtype_name__ = 'AlpacaChatTab'

    def __init__(self, chat_window:chat):
        self.chat_window=chat_window
        self.spinner = Gtk.Spinner(
            spinning=True,
            visible=False
        )
        self.label = Gtk.Label(
            label=self.chat_window.get_name(),
            tooltip_text=self.chat_window.get_name(),
            hexpand=True,
            halign=0,
            wrap=True,
            ellipsize=3,
            wrap_mode=2,
            xalign=0
        )
        self.indicator = Gtk.Image.new_from_icon_name("chat-bubble-text-symbolic")
        self.indicator.set_visible(False)
        self.indicator.set_css_classes(['accent'])
        container = Gtk.Box(
            orientation=0,
            spacing=5
        )
        container.append(self.label)
        container.append(self.spinner)
        container.append(self.indicator)
        super().__init__(
            css_classes = ["chat_row"],
            height_request = 45,
            child = container
        )

        self.gesture = Gtk.GestureClick(button=3)
        self.gesture.connect("released", self.chat_click_handler)
        self.add_controller(self.gesture)

    def chat_click_handler(self, gesture, n_press, x, y):
        chat_row = gesture.get_widget()
        popover = Gtk.PopoverMenu(
            menu_model=window.chat_right_click_menu,
            has_arrow=False,
            halign=1,
            height_request=155
        )
        window.selected_chat_row = chat_row
        position = Gdk.Rectangle()
        position.x = x
        position.y = y
        popover.set_parent(chat_row.get_child())
        popover.set_pointing_to(position)
        popover.popup()

class chat_list(Gtk.ListBox):
    __gtype_name__ = 'AlpacaChatList'

    def __init__(self):
        super().__init__(
            selection_mode=1,
            css_classes=["navigation-sidebar"]
        )
        self.connect("row-selected", lambda listbox, row: self.chat_changed(row))
        self.tab_list = []

    def update_welcome_screens(self, show_prompts:bool):
        for tab in self.tab_list:
            if tab.chat_window.welcome_screen:
                tab.chat_window.show_welcome_screen(show_prompts)

    def get_tab_by_name(self, chat_name:str) -> chat_tab:
        for tab in self.tab_list:
            if tab.chat_window.get_name() == chat_name:
                return tab

    def get_chat_by_name(self, chat_name:str) -> chat:
        tab = self.get_tab_by_name(chat_name)
        if tab:
            return tab.chat_window

    def get_current_chat(self) -> chat:
        row = self.get_selected_row()
        if row:
            return self.get_selected_row().chat_window

    def send_tab_to_top(self, tab:chat_tab):
        self.unselect_all()
        self.tab_list.remove(tab)
        self.tab_list.insert(0, tab)
        self.remove(tab)
        self.prepend(tab)
        self.select_row(tab)

    def append_chat(self, chat_name:str) -> chat:
        chat_name = window.generate_numbered_name(chat_name, [tab.chat_window.get_name() for tab in self.tab_list])
        chat_window = chat(chat_name)
        tab = chat_tab(chat_window)
        self.append(tab)
        self.tab_list.append(tab)
        window.chat_stack.add_child(chat_window)
        return chat_window

    def prepend_chat(self, chat_name:str) -> chat:
        chat_name = window.generate_numbered_name(chat_name, [tab.chat_window.get_name() for tab in self.tab_list])
        chat_window = chat(chat_name)
        tab = chat_tab(chat_window)
        self.prepend(tab)
        self.tab_list.insert(0, tab)
        chat_window.show_welcome_screen(len(window.model_manager.get_model_list()) > 0)
        window.chat_stack.add_child(chat_window)
        window.chat_list_box.select_row(tab)
        return chat_window

    def new_chat(self):
        window.save_history(self.prepend_chat(_("New Chat")))

    def delete_chat(self, chat_name:str):
        chat_tab = None
        for c in self.tab_list:
            if c.chat_window.get_name() == chat_name:
                chat_tab = c
        if chat_tab:
            chat_tab.chat_window.stop_message()
            window.chat_stack.remove(chat_tab.chat_window)
            self.tab_list.remove(chat_tab)
            self.remove(chat_tab)
            if os.path.exists(os.path.join(data_dir, "chats", chat_name)):
                shutil.rmtree(os.path.join(data_dir, "chats", chat_name))
            if len(self.tab_list) == 0:
                self.new_chat()
            if not self.get_current_chat() or self.get_current_chat() == chat_tab.chat_window:
                self.select_row(self.get_row_at_index(0))
            window.save_history()

    def rename_chat(self, old_chat_name:str, new_chat_name:str):
        tab = self.get_tab_by_name(old_chat_name)
        if tab:
            new_chat_name = window.generate_numbered_name(new_chat_name, [tab.chat_window.get_name() for tab in self.tab_list])
            tab.label.set_label(new_chat_name)
            tab.label.set_tooltip_text(new_chat_name)
            tab.chat_window.set_name(new_chat_name)
            if os.path.exists(os.path.join(data_dir, "chats", old_chat_name)):
                shutil.move(os.path.join(data_dir, "chats", old_chat_name), os.path.join(data_dir, "chats", new_chat_name))
            window.save_history(tab.chat_window)

    def duplicate_chat(self, chat_name:str):
        new_chat_name = window.generate_numbered_name(_("Copy of {}").format(chat_name), [tab.chat_window.get_name() for tab in self.tab_list])
        try:
            shutil.copytree(os.path.join(data_dir, "chats", chat_name), os.path.join(data_dir, "chats", new_chat_name))
        except Exception as e:
            logger.error(e)
        self.prepend_chat(new_chat_name)
        created_chat = self.get_tab_by_name(new_chat_name).chat_window
        created_chat.load_chat_messages(self.get_tab_by_name(chat_name).chat_window.messages_to_dict())
        window.save_history(created_chat)

    def on_replace_contents(self, file, result):
        file.replace_contents_finish(result)
        window.show_toast(_("Chat exported successfully"), window.main_overlay)

    def on_export_chat(self, file_dialog, result, chat_name):
        file = file_dialog.save_finish(result)
        if not file:
            return
        json_data = json.dumps({chat_name: {"messages": self.get_chat_by_name(chat_name).messages_to_dict()}}, indent=4).encode("UTF-8")

        with tempfile.TemporaryDirectory() as temp_dir:
            json_path = os.path.join(temp_dir, "data.json")
            with open(json_path, "wb") as json_file:
                json_file.write(json_data)

            tar_path = os.path.join(temp_dir, chat_name)
            with tarfile.open(tar_path, "w") as tar:
                tar.add(json_path, arcname="data.json")
                directory = os.path.join(data_dir, "chats", chat_name)
                if os.path.exists(directory) and os.path.isdir(directory):
                    tar.add(directory, arcname=os.path.basename(directory))

            with open(tar_path, "rb") as tar:
                tar_content = tar.read()

            file.replace_contents_async(
                tar_content,
                etag=None,
                make_backup=False,
                flags=Gio.FileCreateFlags.NONE,
                cancellable=None,
                callback=self.on_replace_contents
            )

    def export_chat(self, chat_name:str):
        logger.info("Exporting chat")
        file_dialog = Gtk.FileDialog(initial_name=f"{chat_name}.tar")
        file_dialog.save(parent=window, cancellable=None, callback=lambda file_dialog, result, chat_name=chat_name: self.on_export_chat(file_dialog, result, chat_name))

    def on_chat_imported(self, file_dialog, result):
        file = file_dialog.open_finish(result)
        if not file:
            return
        stream = file.read(None)
        data_stream = Gio.DataInputStream.new(stream)
        tar_content = data_stream.read_bytes(1024 * 1024, None)

        with tempfile.TemporaryDirectory() as temp_dir:
            tar_filename = os.path.join(temp_dir, "imported_chat.tar")

            with open(tar_filename, "wb") as tar_file:
                tar_file.write(tar_content.get_data())

            with tarfile.open(tar_filename, "r") as tar:
                tar.extractall(path=temp_dir)
                chat_name = None
                chat_content = None
                for member in tar.getmembers():
                    if member.name == "data.json":
                        json_filepath = os.path.join(temp_dir, member.name)
                        with open(json_filepath, "r", encoding="utf-8") as json_file:
                            data = json.load(json_file)
                        for chat_name, chat_content in data.items():
                            new_chat_name = window.generate_numbered_name(chat_name, [tab.chat_window.get_name() for tab in self.tab_list])
                            src_path = os.path.join(temp_dir, chat_name)
                            dest_path = os.path.join(data_dir, "chats", new_chat_name)
                            if os.path.exists(src_path) and os.path.isdir(src_path) and not os.path.exists(dest_path):
                                shutil.copytree(src_path, dest_path)

                            created_chat = self.prepend_chat(new_chat_name)
                            created_chat.load_chat_messages(chat_content['messages'])
                            window.save_history(created_chat)
        window.show_toast(_("Chat imported successfully"), window.main_overlay)

    def import_chat(self):
        logger.info("Importing chat")
        file_dialog = Gtk.FileDialog(default_filter=window.file_filter_tar)
        file_dialog.open(window, None, self.on_chat_imported)

    def chat_changed(self, row):
        if row:
            current_tab_i = next((i for i, t in enumerate(self.tab_list) if t.chat_window == window.chat_stack.get_visible_child()), -1)
            if self.tab_list.index(row) != current_tab_i:
                window.chat_stack.set_transition_type(4 if self.tab_list.index(row) > current_tab_i else 5)
                window.chat_stack.set_visible_child(row.chat_window)
                window.switch_send_stop_button(not row.chat_window.busy)
                if len(row.chat_window.messages) > 0:
                    last_model_used = row.chat_window.messages[list(row.chat_window.messages)[-1]].model
                    window.model_manager.change_model(last_model_used)
                if row.indicator.get_visible():
                    row.indicator.set_visible(False)
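`chat_list` deduplicates chat names everywhere (new, rename, duplicate, import) through `window.generate_numbered_name`, whose implementation is not part of this diff. A hypothetical sketch of such a helper, assuming it appends an increasing counter until the name is unique among existing chats:

```python
def generate_numbered_name(name, existing):
    """Illustrative stand-in for window.generate_numbered_name (not
    from the diff): return `name` unchanged if free, otherwise append
    the first counter that makes it unique."""
    if name not in existing:
        return name
    i = 2
    while f"{name} {i}" in existing:
        i += 1
    return f"{name} {i}"

print(generate_numbered_name("New Chat", []))                          # New Chat
print(generate_numbered_name("New Chat", ["New Chat"]))                # New Chat 2
print(generate_numbered_name("New Chat", ["New Chat", "New Chat 2"]))  # New Chat 3
```

Whatever the real scheme is, centralizing it in one helper keeps tab labels, `Gtk.Stack` child names, and on-disk chat directories consistent with each other.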
545 src/custom_widgets/message_widget.py Normal file
@@ -0,0 +1,545 @@
#message_widget.py
"""
Handles the message widget
"""

import gi
gi.require_version('Gtk', '4.0')
gi.require_version('GtkSource', '5')
from gi.repository import Gtk, GObject, Gio, Adw, GtkSource, GLib, Gdk
import logging, os, datetime, re, shutil, threading, sys
from ..internal import config_dir, data_dir, cache_dir, source_dir
from .table_widget import TableWidget

logger = logging.getLogger(__name__)

window = None

class edit_text_block(Gtk.TextView):
    __gtype_name__ = 'AlpacaEditTextBlock'

    def __init__(self, text:str):
        super().__init__(
            hexpand=True,
            halign=0,
            margin_top=5,
            margin_bottom=5,
            margin_start=5,
            margin_end=5,
            css_classes=["view", "editing_message_textview"]
        )
        self.get_buffer().insert(self.get_buffer().get_start_iter(), text, len(text.encode('utf-8')))
        enter_key_controller = Gtk.EventControllerKey.new()
        enter_key_controller.connect("key-pressed", lambda controller, keyval, keycode, state: self.edit_message() if keyval==Gdk.KEY_Return and not (state & Gdk.ModifierType.SHIFT_MASK) else None)
        self.add_controller(enter_key_controller)

    def edit_message(self):
        self.get_parent().get_parent().action_buttons.set_visible(True)
        self.get_parent().get_parent().set_text(self.get_buffer().get_text(self.get_buffer().get_start_iter(), self.get_buffer().get_end_iter(), False))
        self.get_parent().get_parent().add_footer(self.get_parent().get_parent().dt)
        window.save_history(self.get_parent().get_parent().get_parent().get_parent().get_parent().get_parent())
        self.get_parent().remove(self)
        window.show_toast(_("Message edited successfully"), window.main_overlay)
        return True

class text_block(Gtk.Label):
    __gtype_name__ = 'AlpacaTextBlock'

    def __init__(self, bot:bool):
        super().__init__(
            hexpand=True,
            halign=0,
            wrap=True,
            wrap_mode=0,
            xalign=0,
            margin_top=5,
            margin_start=5,
            margin_end=5,
            focusable=True,
            selectable=True
        )
        self.update_property([4, 7], [_("Response message") if bot else _("User message"), False])
        self.connect('notify::has-focus', lambda *_: None if self.has_focus() else self.remove_selection())

    def remove_selection(self):
        self.set_selectable(False)
        self.set_selectable(True)

    def insert_at_end(self, text:str, markdown:bool):
        if markdown:
            self.set_markup(self.get_text() + text)
        else:
            self.set_text(self.get_text() + text)
        self.update_property([1], [self.get_text()])

    def clear_text(self):
        # Gtk.Label has no text buffer, so clearing is a plain set_text
        self.set_text("")
        self.update_property([1], [""])

class code_block(Gtk.Box):
    __gtype_name__ = 'AlpacaCodeBlock'

    def __init__(self, text:str, language_name:str=None):
        super().__init__(
            css_classes=["card", "code_block"],
            orientation=1,
            overflow=1,
            margin_start=5,
            margin_end=5
        )

        self.language = None
        if language_name:
            self.language = GtkSource.LanguageManager.get_default().get_language(language_name)
        if self.language:
            self.buffer = GtkSource.Buffer.new_with_language(self.language)
        else:
            self.buffer = GtkSource.Buffer()
        self.buffer.set_style_scheme(GtkSource.StyleSchemeManager.get_default().get_scheme('Adwaita-dark'))
        self.source_view = GtkSource.View(
            auto_indent=True, indent_width=4, buffer=self.buffer, show_line_numbers=True, editable=False,
            top_margin=6, bottom_margin=6, left_margin=12, right_margin=12, css_classes=["code_block"]
        )
        self.source_view.update_property([4], [_("{}Code Block").format('{} '.format(self.language.get_name()) if self.language else "")])

        title_box = Gtk.Box(margin_start=12, margin_top=3, margin_bottom=3, margin_end=3)
        title_box.append(Gtk.Label(label=self.language.get_name() if self.language else _("Code Block"), hexpand=True, xalign=0))
        copy_button = Gtk.Button(icon_name="edit-copy-symbolic", css_classes=["flat", "circular"], tooltip_text=_("Copy Message"))
        copy_button.connect("clicked", lambda *_: self.on_copy())
        title_box.append(copy_button)
        self.append(title_box)
        self.append(Gtk.Separator())
        self.append(self.source_view)
        self.buffer.set_text(text)

    def on_copy(self):
        logger.debug("Copying code")
        clipboard = Gdk.Display.get_default().get_clipboard()
        start = self.buffer.get_start_iter()
        end = self.buffer.get_end_iter()
        text = self.buffer.get_text(start, end, False)
        clipboard.set(text)
        window.show_toast(_("Code copied to the clipboard"), window.main_overlay)

class attachment(Gtk.Button):
    __gtype_name__ = 'AlpacaAttachment'

    def __init__(self, file_name:str, file_path:str, file_type:str):
        self.file_name = file_name
        self.file_path = file_path
        self.file_type = file_type

        directory, file_name = os.path.split(self.file_path)
        head, last_dir = os.path.split(directory)
        head, second_last_dir = os.path.split(head)
        self.file_path = os.path.join(head, '{selected_chat}', last_dir, file_name)

        button_content = Adw.ButtonContent(
            label=self.file_name,
            icon_name={
                "plain_text": "document-text-symbolic",
                "pdf": "document-text-symbolic",
                "youtube": "play-symbolic",
                "website": "globe-symbolic"
            }[self.file_type]
        )

        super().__init__(
            vexpand=False,
            valign=3,
            name=self.file_name,
            css_classes=["flat"],
            tooltip_text=self.file_name,
            child=button_content
        )

        self.connect("clicked", lambda button, file_path=self.file_path, file_type=self.file_type: window.preview_file(file_path, file_type, None))

class attachment_container(Gtk.ScrolledWindow):
    __gtype_name__ = 'AlpacaAttachmentContainer'

    def __init__(self):
        self.files = []

        self.container = Gtk.Box(
            orientation=0,
            spacing=12
        )

        super().__init__(
            margin_top=10,
            margin_start=10,
            margin_end=10,
            hexpand=True,
            child=self.container
        )

    def add_file(self, file:attachment):
        self.container.append(file)
        self.files.append(file)

class image(Gtk.Button):
    __gtype_name__ = 'AlpacaImage'

    def __init__(self, image_path:str):
        self.image_path = image_path
        self.image_name = os.path.basename(self.image_path)

        directory, file_name = os.path.split(self.image_path)
        head, last_dir = os.path.split(directory)
        head, second_last_dir = os.path.split(head)

        try:
            if not os.path.isfile(self.image_path):
                raise FileNotFoundError("'{}' was not found or is a directory".format(self.image_path))
            image = Gtk.Image.new_from_file(self.image_path)
            image.set_size_request(240, 240)
            super().__init__(
                child=image,
                css_classes=["flat", "chat_image_button"],
                name=self.image_name,
                tooltip_text=_("Image")
            )
            image.update_property([4], [_("Image")])
        except Exception as e:
            logger.error(e)
            image_texture = Gtk.Image.new_from_icon_name("image-missing-symbolic")
            image_texture.set_icon_size(2)
            image_texture.set_vexpand(True)
            image_texture.set_pixel_size(120)
            image_label = Gtk.Label(
                label=_("Missing Image"),
            )
            image_box = Gtk.Box(
                spacing=10,
                orientation=1,
                margin_top=10,
                margin_bottom=10,
                margin_start=10,
                margin_end=10
            )
            image_box.append(image_texture)
            image_box.append(image_label)
            image_box.set_size_request(220, 220)
            super().__init__(
                child=image_box,
                css_classes=["flat", "chat_image_button"],
                tooltip_text=_("Missing Image")
            )
            image_texture.update_property([4], [_("Missing image")])
        self.connect("clicked", lambda button, file_path=os.path.join(head, '{selected_chat}', last_dir, file_name): window.preview_file(file_path, 'image', None))

class image_container(Gtk.ScrolledWindow):
    __gtype_name__ = 'AlpacaImageContainer'

    def __init__(self):
        self.files = []

        self.container = Gtk.Box(
            orientation=0,
            spacing=12
        )

        super().__init__(
            margin_top=10,
            margin_start=10,
            margin_end=10,
            hexpand=True,
            height_request=240,
            child=self.container
        )

    def add_image(self, img:image):
        self.container.append(img)
        self.files.append(img)

class footer(Gtk.Label):
    __gtype_name__ = 'AlpacaMessageFooter'

    def __init__(self, dt:datetime.datetime, model:str=None):
        super().__init__(
            hexpand=False,
            halign=0,
            wrap=True,
            ellipsize=3,
            wrap_mode=2,
            xalign=0,
            margin_bottom=5,
            margin_start=5,
            focusable=True
        )
        self.set_markup("<small>{}{}</small>".format((window.convert_model_name(model, 0) + " • ") if model else "", GLib.markup_escape_text(self.format_datetime(dt))))

    def format_datetime(self, dt:datetime.datetime) -> str:
        date = GLib.DateTime.new(GLib.DateTime.new_now_local().get_timezone(), dt.year, dt.month, dt.day, dt.hour, dt.minute, dt.second)
        current_date = GLib.DateTime.new_now_local()
        if date.format("%Y/%m/%d") == current_date.format("%Y/%m/%d"):
            return date.format("%I:%M %p")
        if date.format("%Y") == current_date.format("%Y"):
            return date.format("%b %d, %I:%M %p")
        return date.format("%b %d %Y, %I:%M %p")

class action_buttons(Gtk.Box):
    __gtype_name__ = 'AlpacaActionButtonContainer'

    def __init__(self, bot:bool):
        super().__init__(
            orientation=0,
            spacing=6,
            margin_end=6,
            margin_bottom=6,
            valign="end",
            halign="end"
        )

        self.delete_button = Gtk.Button(
            icon_name = "user-trash-symbolic",
            css_classes = ["flat", "circular"],
            tooltip_text = _("Remove Message")
        )
        self.delete_button.connect('clicked', lambda *_: self.delete_message())
        self.append(self.delete_button)

        self.copy_button = Gtk.Button(
            icon_name = "edit-copy-symbolic",
            css_classes = ["flat", "circular"],
            tooltip_text = _("Copy Message")
        )
        self.copy_button.connect('clicked', lambda *_: self.copy_message())
        self.append(self.copy_button)

        self.regenerate_button = Gtk.Button(
            icon_name = "update-symbolic",
            css_classes = ["flat", "circular"],
            tooltip_text = _("Regenerate Message")
        )
        self.regenerate_button.connect('clicked', lambda *_: self.regenerate_message())

        self.edit_button = Gtk.Button(
            icon_name = "edit-symbolic",
            css_classes = ["flat", "circular"],
            tooltip_text = _("Edit Message")
        )
        self.edit_button.connect('clicked', lambda *_: self.edit_message())

        self.append(self.regenerate_button if bot else self.edit_button)

    def delete_message(self):
        logger.debug("Deleting message")
        chat = self.get_parent().get_parent().get_parent().get_parent().get_parent()
        message_id = self.get_parent().message_id
        self.get_parent().get_parent().remove(self.get_parent())
        if os.path.exists(os.path.join(data_dir, "chats", window.chat_list_box.get_current_chat().get_name(), message_id)):
            shutil.rmtree(os.path.join(data_dir, "chats", window.chat_list_box.get_current_chat().get_name(), message_id))
        del chat.messages[message_id]
        window.save_history(chat)
        if len(chat.messages) == 0:
            chat.show_welcome_screen(len(window.model_manager.get_model_list()) > 0)

    def copy_message(self):
        logger.debug("Copying message")
        clipboard = Gdk.Display.get_default().get_clipboard()
        clipboard.set(self.get_parent().text)
        window.show_toast(_("Message copied to the clipboard"), window.main_overlay)

    def regenerate_message(self):
        chat = self.get_parent().get_parent().get_parent().get_parent().get_parent()
        message_element = self.get_parent()
        if not chat.busy:
            message_element.set_text()
            if message_element.footer:
                message_element.container.remove(message_element.footer)
            message_element.remove_overlay(self)
            message_element.action_buttons = None
            history = window.convert_history_to_ollama(chat)[:list(chat.messages).index(message_element.message_id)]
            data = {
                "model": window.model_manager.get_selected_model(),
                "messages": history,
                "options": {"temperature": window.ollama_instance.tweaks["temperature"], "seed": window.ollama_instance.tweaks["seed"]},
                "keep_alive": f"{window.ollama_instance.tweaks['keep_alive']}m"
            }
            thread = threading.Thread(target=window.run_message, args=(data, message_element, chat))
            thread.start()
        else:
            window.show_toast(_("Message cannot be regenerated while receiving a response"), window.main_overlay)

    def edit_message(self):
        logger.debug("Editing message")
        self.get_parent().action_buttons.set_visible(False)
        for child in self.get_parent().content_children:
            self.get_parent().container.remove(child)
        self.get_parent().content_children = []
        self.get_parent().container.remove(self.get_parent().footer)
        self.get_parent().footer = None
        edit_text_b = edit_text_block(self.get_parent().text)
        self.get_parent().container.append(edit_text_b)
        window.set_focus(edit_text_b)


class message(Gtk.Overlay):
    __gtype_name__ = 'AlpacaMessage'

    def __init__(self, message_id:str, model:str=None):
        self.message_id = message_id
        self.bot = model != None
        self.dt = None
        self.model = model
        self.action_buttons = None
        self.content_children = [] #These are the code blocks, text blocks and tables
        self.footer = None
        self.image_c = None
        self.attachment_c = None
        self.spinner = None
        self.text = None

        self.container = Gtk.Box(
            orientation=1,
            halign='fill',
            css_classes=["response_message"] if self.bot else ["card", "user_message"],
            spacing=12
        )

        super().__init__(css_classes=["message"], name=message_id)
        self.set_child(self.container)

    def add_attachments(self, attachments:dict):
        self.attachment_c = attachment_container()
        self.container.append(self.attachment_c)
        for file_path, file_type in attachments.items():
            file = attachment(os.path.basename(file_path), file_path, file_type)
            self.attachment_c.add_file(file)

    def add_images(self, images:list):
        self.image_c = image_container()
        self.container.append(self.image_c)
        for image_path in images:
            image_element = image(image_path)
            self.image_c.add_image(image_element)

    def add_footer(self, dt:datetime.datetime):
        self.dt = dt
        self.footer = footer(self.dt, self.model)
        self.container.append(self.footer)

    def add_action_buttons(self):
        if not self.action_buttons:
            self.action_buttons = action_buttons(self.bot)
            self.add_overlay(self.action_buttons)

    def update_message(self, data:dict):
        chat = self.get_parent().get_parent().get_parent().get_parent()
        if chat.busy:
            vadjustment = chat.get_vadjustment()
            if self.spinner:
                self.container.remove(self.spinner)
                self.spinner = None
                self.content_children[-1].set_visible(True)
                GLib.idle_add(vadjustment.set_value, vadjustment.get_upper())
            elif vadjustment.get_value() + 50 >= vadjustment.get_upper() - vadjustment.get_page_size():
                GLib.idle_add(vadjustment.set_value, vadjustment.get_upper() - vadjustment.get_page_size())
            self.content_children[-1].insert_at_end(data['message']['content'], False)
            if 'done' in data and data['done']:
                window.chat_list_box.get_tab_by_name(chat.get_name()).spinner.set_visible(False)
                if window.chat_list_box.get_current_chat().get_name() != chat.get_name():
                    window.chat_list_box.get_tab_by_name(chat.get_name()).indicator.set_visible(True)
                if chat.welcome_screen:
                    chat.container.remove(chat.welcome_screen)
                    chat.welcome_screen = None
                chat.stop_message()
                self.set_text(self.content_children[-1].get_label())
                self.dt = datetime.datetime.now()
                self.add_footer(self.dt)
                window.show_notification(chat.get_name(), self.text[:200] + (self.text[200:] and '...'), Gio.ThemedIcon.new("chat-message-new-symbolic"))
                window.save_history(chat)
        else:
            sys.exit()

    def set_text(self, text:str=None):
        self.text = text
        for child in self.content_children:
            self.container.remove(child)
        self.content_children = []
        if text:
            code_block_pattern = re.compile(r'```(\w+)\n(.*?)\n```', re.DOTALL)
            no_lang_code_block_pattern = re.compile(r'`\n(.*?)\n`', re.DOTALL)
            table_pattern = re.compile(r'((\r?\n){2}|^)([^\r\n]*\|[^\r\n]*(\r?\n)?)+(?=(\r?\n){2}|$)', re.MULTILINE)
            bold_pattern = re.compile(r'\*\*(.*?)\*\*') #"**text**"
            code_pattern = re.compile(r'`([^`\n]*?)`') #"`text`"
            h1_pattern = re.compile(r'^#\s(.*)$', re.MULTILINE) #"# text"
            h2_pattern = re.compile(r'^##\s(.*)$', re.MULTILINE) #"## text"
            markup_pattern = re.compile(r'<(b|u|tt|span.*)>(.*?)<\/(b|u|tt|span)>') #already-valid Pango markup tags
            parts = []
            pos = 0
            # Code blocks
            for match in code_block_pattern.finditer(self.text):
                start, end = match.span()
                if pos < start:
                    normal_text = self.text[pos:start]
                    parts.append({"type": "normal", "text": normal_text.strip()})
                language = match.group(1)
                code_text = match.group(2)
                parts.append({"type": "code", "text": code_text, "language": 'python3' if language == 'python' else language})
                pos = end
            # Code blocks (No language)
            for match in no_lang_code_block_pattern.finditer(self.text):
                start, end = match.span()
                if pos < start:
                    normal_text = self.text[pos:start]
                    parts.append({"type": "normal", "text": normal_text.strip()})
                code_text = match.group(1)
                parts.append({"type": "code", "text": code_text, "language": None})
                pos = end
            # Tables
            for match in table_pattern.finditer(self.text):
                start, end = match.span()
                if pos < start:
                    normal_text = self.text[pos:start]
                    parts.append({"type": "normal", "text": normal_text.strip()})
                table_text = match.group(0)
                parts.append({"type": "table", "text": table_text})
                pos = end
            # Text blocks
            if pos < len(text):
                normal_text = text[pos:]
                if normal_text.strip():
                    parts.append({"type": "normal", "text": normal_text.strip()})

            for part in parts:
                if part['type'] == 'normal':
                    text_b = text_block(self.bot)
                    part['text'] = part['text'].replace("\n* ", "\n• ")
                    part['text'] = code_pattern.sub(r'<tt>\1</tt>', part['text'])
                    part['text'] = bold_pattern.sub(r'<b>\1</b>', part['text'])
                    part['text'] = h1_pattern.sub(r'<span size="x-large">\1</span>', part['text'])
                    part['text'] = h2_pattern.sub(r'<span size="large">\1</span>', part['text'])
                    pos = 0
                    for match in markup_pattern.finditer(part['text']):
                        start, end = match.span()
                        if pos < start:
                            text_b.insert_at_end(part['text'][pos:start], False)
                        text_b.insert_at_end(match.group(0), True)
                        pos = end

                    if pos < len(part['text']):
                        text_b.insert_at_end(part['text'][pos:], False)
                    self.content_children.append(text_b)
                    self.container.append(text_b)
                elif part['type'] == 'code':
                    code_b = code_block(part['text'], part['language'])
                    self.content_children.append(code_b)
                    self.container.append(code_b)
                elif part['type'] == 'table':
                    table_w = TableWidget(part['text'])
                    self.content_children.append(table_w)
                    self.container.append(table_w)
            self.add_action_buttons()
        else:
            text_b = text_block(self.bot)
            text_b.set_visible(False)
            self.content_children.append(text_b)
            self.spinner = Gtk.Spinner(spinning=True, margin_top=12, margin_bottom=12, hexpand=True)
            self.container.append(self.spinner)
            self.container.append(text_b)
        self.container.queue_draw()
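`set_text` above partitions a finished response into normal text, fenced code blocks, and tables with regexes before building a widget per part. A standalone sketch of the language-tagged code-block split, using the same pattern minus the GTK widgets (the `split_message` helper name is illustrative, not part of the file):

```python
import re

# Same pattern set_text compiles for ```lang ...``` fenced blocks.
CODE_BLOCK = re.compile(r'```(\w+)\n(.*?)\n```', re.DOTALL)

def split_message(text: str) -> list:
    """Split text into 'normal' and 'code' parts, mirroring set_text's first pass."""
    parts, pos = [], 0
    for match in CODE_BLOCK.finditer(text):
        start, end = match.span()
        if pos < start:
            # Plain prose between the previous match and this fence
            parts.append({"type": "normal", "text": text[pos:start].strip()})
        parts.append({"type": "code", "language": match.group(1), "text": match.group(2)})
        pos = end
    if pos < len(text) and text[pos:].strip():
        parts.append({"type": "normal", "text": text[pos:].strip()})
    return parts
```

Each `normal` part then gets the inline Pango substitutions (`**bold**`, backtick spans, headers), while `code` parts become `code_block` widgets.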
532
src/custom_widgets/model_widget.py
Normal file
@@ -0,0 +1,532 @@
|
|||||||
|
#model_widget.py
|
||||||
|
"""
|
||||||
|
Handles the model widget (testing)
|
||||||
|
"""
|
||||||
|
|
||||||
|
import gi
|
||||||
|
gi.require_version('Gtk', '4.0')
|
||||||
|
gi.require_version('GtkSource', '5')
|
||||||
|
from gi.repository import Gtk, GObject, Gio, Adw, GtkSource, GLib, Gdk
|
||||||
|
import logging, os, datetime, re, shutil, threading, json, sys
|
||||||
|
from ..internal import config_dir, data_dir, cache_dir, source_dir
|
||||||
|
from .. import available_models_descriptions, dialogs
|
||||||
|
|
||||||
|
logger = logging.getLogger(__name__)
|
||||||
|
|
||||||
|
window = None
|
||||||
|
|
||||||
|
class model_selector_popup(Gtk.Popover):
|
||||||
|
__gtype_name__ = 'AlpacaModelSelectorPopup'
|
||||||
|
|
||||||
|
def __init__(self):
|
||||||
|
manage_models_button = Gtk.Button(
|
||||||
|
tooltip_text=_('Manage Models'),
|
||||||
|
child=Gtk.Label(label=_('Manage Models'), halign=1),
|
||||||
|
hexpand=True,
|
||||||
|
css_classes=['manage_models_button', 'flat']
|
||||||
|
)
|
||||||
|
manage_models_button.set_action_name("app.manage_models")
|
||||||
|
manage_models_button.connect("clicked", lambda *_: self.hide())
|
||||||
|
self.model_list_box = Gtk.ListBox(
|
||||||
|
css_classes=['navigation-sidebar', 'model_list_box'],
|
||||||
|
height_request=0
|
||||||
|
)
|
||||||
|
container = Gtk.Box(
|
||||||
|
orientation=1,
|
||||||
|
spacing=5
|
||||||
|
)
|
||||||
|
container.append(self.model_list_box)
|
||||||
|
container.append(Gtk.Separator())
|
||||||
|
container.append(manage_models_button)
|
||||||
|
|
||||||
|
scroller = Gtk.ScrolledWindow(
|
||||||
|
max_content_height=300,
|
||||||
|
propagate_natural_width=True,
|
||||||
|
propagate_natural_height=True,
|
||||||
|
child=container
|
||||||
|
)
|
||||||
|
|
||||||
|
super().__init__(
|
||||||
|
css_classes=['model_popover'],
|
||||||
|
has_arrow=False,
|
||||||
|
child=scroller
|
||||||
|
)
|
||||||
|
|
||||||
|
class model_selector_button(Gtk.MenuButton):
|
||||||
|
__gtype_name__ = 'AlpacaModelSelectorButton'
|
||||||
|
|
||||||
|
def __init__(self):
|
||||||
|
self.popover = model_selector_popup()
|
||||||
|
self.popover.model_list_box.connect('selected-rows-changed', self.model_changed)
|
||||||
|
self.popover.model_list_box.connect('row-activated', lambda *_: self.get_popover().hide())
|
||||||
|
container = Gtk.Box(
|
||||||
|
orientation=0,
|
||||||
|
spacing=5
|
||||||
|
)
|
||||||
|
self.label = Gtk.Label(label=_('Select a Model'))
|
||||||
|
container.append(self.label)
|
||||||
|
container.append(Gtk.Image.new_from_icon_name("down-symbolic"))
|
||||||
|
super().__init__(
|
||||||
|
tooltip_text=_('Select a Model'),
|
||||||
|
child=container,
|
||||||
|
popover=self.popover
|
||||||
|
)
|
||||||
|
|
||||||
|
def change_model(self, model_name:str):
|
||||||
|
for model_row in list(self.get_popover().model_list_box):
|
||||||
|
if model_name == model_row.get_name():
|
||||||
|
self.get_popover().model_list_box.select_row(model_row)
|
||||||
|
break
|
||||||
|
|
||||||
|
def model_changed(self, listbox:Gtk.ListBox):
|
||||||
|
row = listbox.get_selected_row()
|
||||||
|
if row:
|
||||||
|
model_name = row.get_name()
|
||||||
|
self.label.set_label(window.convert_model_name(model_name, 0))
|
||||||
|
self.set_tooltip_text(window.convert_model_name(model_name, 0))
|
||||||
|
elif len(list(listbox)) == 0:
|
||||||
|
self.label.set_label(_("Select a Model"))
|
||||||
|
self.set_tooltip_text(_("Select a Model"))
|
||||||
|
window.model_manager.verify_if_image_can_be_used()
|
||||||
|
|
||||||
|
def add_model(self, model_name:str):
|
||||||
|
model_row = Gtk.ListBoxRow(
|
||||||
|
child = Gtk.Label(
|
||||||
|
label=window.convert_model_name(model_name, 0),
|
||||||
|
halign=1,
|
||||||
|
hexpand=True
|
||||||
|
),
|
||||||
|
halign=0,
|
||||||
|
hexpand=True,
|
||||||
|
name=model_name,
|
||||||
|
tooltip_text=window.convert_model_name(model_name, 0)
|
||||||
|
)
|
||||||
|
self.get_popover().model_list_box.append(model_row)
|
||||||
|
self.change_model(model_name)
|
||||||
|
|
||||||
|
def remove_model(self, model_name:str):
|
||||||
|
self.get_popover().model_list_box.remove(next((model for model in list(self.get_popover().model_list_box) if model.get_name() == model_name), None))
|
||||||
|
self.model_changed(self.get_popover().model_list_box)
|
||||||
|
|
||||||
|
def clear_list(self):
|
||||||
|
self.get_popover().model_list_box.remove_all()
|
||||||
|
|
||||||
|
class pulling_model(Gtk.ListBoxRow):
|
||||||
|
__gtype_name__ = 'AlpacaPullingModel'
|
||||||
|
|
||||||
|
def __init__(self, model_name:str):
|
||||||
|
model_label = Gtk.Label(
|
||||||
|
css_classes=["heading"],
|
||||||
|
label=model_name.split(":")[0].replace("-", " ").title(),
|
||||||
|
hexpand=True,
|
||||||
|
halign=1
|
||||||
|
)
|
||||||
|
tag_label = Gtk.Label(
|
||||||
|
css_classes=["subtitle"],
|
||||||
|
label=model_name.split(":")[1]
|
||||||
|
)
|
||||||
|
self.prc_label = Gtk.Label(
|
||||||
|
css_classes=["subtitle", "numeric"],
|
||||||
|
label='50%',
|
||||||
|
hexpand=True,
|
||||||
|
halign=2
|
||||||
|
)
|
||||||
|
subtitle_box = Gtk.Box(
|
||||||
|
hexpand=True,
|
||||||
|
spacing=5,
|
||||||
|
orientation=0
|
||||||
|
)
|
||||||
|
subtitle_box.append(tag_label)
|
||||||
|
subtitle_box.append(self.prc_label)
|
||||||
|
self.progress_bar = Gtk.ProgressBar(
|
||||||
|
valign=2,
|
||||||
|
show_text=False,
|
||||||
|
css_classes=["horizontal"],
|
||||||
|
fraction=.5
|
||||||
|
)
|
||||||
|
description_box = Gtk.Box(
|
||||||
|
hexpand=True,
|
||||||
|
vexpand=True,
|
||||||
|
spacing=5,
|
||||||
|
orientation=1
|
||||||
|
)
|
||||||
|
description_box.append(model_label)
|
||||||
|
description_box.append(subtitle_box)
|
||||||
|
description_box.append(self.progress_bar)
|
||||||
|
|
||||||
|
stop_button = Gtk.Button(
|
||||||
|
icon_name = "media-playback-stop-symbolic",
|
||||||
|
vexpand = False,
|
||||||
|
valign = 3,
|
||||||
|
css_classes = ["destructive-action", "circular"],
|
||||||
|
tooltip_text = _("Stop Pulling '{}'").format(window.convert_model_name(model_name, 0))
|
||||||
|
)
|
||||||
|
stop_button.connect('clicked', lambda *_: dialogs.stop_pull_model(window, self))
|
||||||
|
|
||||||
|
container_box = Gtk.Box(
|
||||||
|
hexpand=True,
|
||||||
|
vexpand=True,
|
||||||
|
spacing=10,
|
||||||
|
orientation=0,
|
||||||
|
margin_top=10,
|
||||||
|
margin_bottom=10,
|
||||||
|
margin_start=10,
|
||||||
|
margin_end=10
|
||||||
|
)
|
||||||
|
|
||||||
|
container_box.append(description_box)
|
||||||
|
container_box.append(stop_button)
|
||||||
|
|
||||||
|
super().__init__(
|
||||||
|
child=container_box,
|
||||||
|
name=model_name
|
||||||
|
)
|
||||||
|
self.error = None
|
||||||
|
|
||||||
|
def update(self, data):
|
||||||
|
if not self.get_parent():
|
||||||
|
sys.exit()
|
||||||
|
if 'error' in data:
|
||||||
|
self.error = data['error']
|
||||||
|
if 'total' in data and 'completed' in data:
|
||||||
|
fraction = round(data['completed'] / data['total'], 4)
|
||||||
|
GLib.idle_add(self.prc_label.set_label, f"{fraction:05.2%}")
|
||||||
|
GLib.idle_add(self.progress_bar.set_fraction, fraction)
|
||||||
|
else:
|
||||||
|
GLib.idle_add(self.prc_label.set_label, data['status'])
|
||||||
|
GLib.idle_add(self.progress_bar.pulse)
|
||||||
|
|
||||||
|
class pulling_model_list(Gtk.ListBox):
|
||||||
|
__gtype_name__ = 'AlpacaPullingModelList'
|
||||||
|
|
||||||
|
def __init__(self):
|
||||||
|
super().__init__(
|
||||||
|
selection_mode=0,
|
||||||
|
css_classes=["boxed-list"],
|
||||||
|
visible=False
|
||||||
|
)
|
||||||
|
|
||||||
|
class local_model(Gtk.ListBoxRow):
    __gtype_name__ = 'AlpacaLocalModel'

    def __init__(self, model_name:str):
        model_title = window.convert_model_name(model_name, 0)

        model_label = Gtk.Label(
            css_classes=["heading"],
            label=model_title.split(" (")[0],
            hexpand=True,
            halign=1
        )
        tag_label = Gtk.Label(
            css_classes=["subtitle"],
            label=model_title.split(" (")[1][:-1],
            hexpand=True,
            halign=1
        )
        description_box = Gtk.Box(
            hexpand=True,
            vexpand=True,
            spacing=5,
            orientation=1
        )
        description_box.append(model_label)
        description_box.append(tag_label)

        delete_button = Gtk.Button(
            icon_name = "user-trash-symbolic",
            vexpand = False,
            valign = 3,
            css_classes = ["destructive-action", "circular"],
            tooltip_text = _("Remove '{}'").format(window.convert_model_name(model_name, 0))
        )
        delete_button.connect('clicked', lambda *_, model_name=model_name: dialogs.delete_model(window, model_name))

        container_box = Gtk.Box(
            hexpand=True,
            vexpand=True,
            spacing=10,
            orientation=0,
            margin_top=10,
            margin_bottom=10,
            margin_start=10,
            margin_end=10
        )
        container_box.append(description_box)
        container_box.append(delete_button)

        super().__init__(
            child=container_box,
            name=model_name
        )

class local_model_list(Gtk.ListBox):
    __gtype_name__ = 'AlpacaLocalModelList'

    def __init__(self):
        super().__init__(
            selection_mode=0,
            css_classes=["boxed-list"],
            visible=False
        )

    def add_model(self, model_name:str):
        model = local_model(model_name)
        self.append(model)
        if not self.get_visible():
            self.set_visible(True)

    def remove_model(self, model_name:str):
        self.remove(next((model for model in list(self) if model.get_name() == model_name), None))

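The `remove_model` method above locates a row by its widget name with `next()` and a `None` default. A minimal sketch of that lookup pattern, with a plain list of objects standing in for the `Gtk.ListBox` rows:

```python
# Name-based row lookup, mirroring remove_model() above.
class Row:
    def __init__(self, name):
        self._name = name

    def get_name(self):
        return self._name

rows = [Row("llama3:latest"), Row("mistral:7b")]

# next() with a default of None returns the first matching row,
# or None when the model is not in the list.
match = next((row for row in rows if row.get_name() == "mistral:7b"), None)
missing = next((row for row in rows if row.get_name() == "phi3:mini"), None)
print(match.get_name(), missing)  # → mistral:7b None
```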
class available_model(Gtk.ListBoxRow):
    __gtype_name__ = 'AlpacaAvailableModel'

    def __init__(self, model_name:str, model_author:str, model_description:str, image_recognition:bool):
        self.model_description = model_description
        self.model_title = model_name.replace("-", " ").title()
        self.model_author = model_author
        self.image_recognition = image_recognition
        model_label = Gtk.Label(
            css_classes=["heading"],
            label="<b>{}</b> <small>by {}</small>".format(self.model_title, self.model_author),
            hexpand=True,
            halign=1,
            use_markup=True
        )
        description_label = Gtk.Label(
            css_classes=["subtitle"],
            label=self.model_description,
            hexpand=True,
            halign=1,
            wrap=True,
            wrap_mode=0,
        )
        image_recognition_indicator = Gtk.Button(
            css_classes=["success", "pill", "image_recognition_indicator"],
            child=Gtk.Label(
                label=_("Image Recognition"),
                css_classes=["subtitle"]
            ),
            halign=1
        )
        description_box = Gtk.Box(
            hexpand=True,
            vexpand=True,
            spacing=5,
            orientation=1
        )
        description_box.append(model_label)
        description_box.append(description_label)
        if self.image_recognition: description_box.append(image_recognition_indicator)

        container_box = Gtk.Box(
            hexpand=True,
            vexpand=True,
            spacing=10,
            orientation=0,
            margin_top=10,
            margin_bottom=10,
            margin_start=10,
            margin_end=10
        )
        next_icon = Gtk.Image.new_from_icon_name("go-next")
        next_icon.update_property([4], [_("Enter download menu for {}").format(self.model_title)])

        container_box.append(description_box)
        container_box.append(next_icon)

        super().__init__(
            child=container_box,
            name=model_name
        )

        gesture_click = Gtk.GestureClick.new()
        gesture_click.connect("pressed", lambda *_: self.show_pull_menu())

        event_controller_key = Gtk.EventControllerKey.new()
        event_controller_key.connect("key-pressed", lambda controller, key, *_: self.show_pull_menu() if key in (Gdk.KEY_space, Gdk.KEY_Return) else None)

        self.add_controller(gesture_click)
        self.add_controller(event_controller_key)

    def confirm_pull_model(self, model_name):
        threading.Thread(target=window.model_manager.pull_model, args=(model_name,)).start()
        window.navigation_view_manage_models.pop()

    def show_pull_menu(self):
        with open(os.path.join(source_dir, 'available_models.json'), 'r', encoding="utf-8") as f:
            data = json.load(f)
        window.navigation_view_manage_models.push_by_tag('model_tags_page')
        window.navigation_view_manage_models.find_page('model_tags_page').set_title(self.get_name().replace("-", " ").title())
        window.model_link_button.set_name(data[self.get_name()]['url'])
        window.model_link_button.set_tooltip_text(data[self.get_name()]['url'])
        window.model_tag_list_box.remove_all()
        tags = data[self.get_name()]['tags']

        for tag_data in tags:
            if f"{self.get_name()}:{tag_data[0]}" not in window.model_manager.get_model_list():
                tag_row = Adw.ActionRow(
                    title = tag_data[0],
                    subtitle = tag_data[1],
                    name = f"{self.get_name()}:{tag_data[0]}"
                )
                download_icon = Gtk.Image.new_from_icon_name("folder-download-symbolic")
                tag_row.add_suffix(download_icon)
                download_icon.update_property([4], [_("Download {}:{}").format(self.get_name(), tag_data[0])])

                gesture_click = Gtk.GestureClick.new()
                gesture_click.connect("pressed", lambda *_, name=f"{self.get_name()}:{tag_data[0]}" : self.confirm_pull_model(name))

                event_controller_key = Gtk.EventControllerKey.new()
                event_controller_key.connect("key-pressed", lambda controller, key, *_, name=f"{self.get_name()}:{tag_data[0]}" : self.confirm_pull_model(name) if key in (Gdk.KEY_space, Gdk.KEY_Return) else None)

                tag_row.add_controller(gesture_click)
                tag_row.add_controller(event_controller_key)

                window.model_tag_list_box.append(tag_row)

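`show_pull_menu` above builds `"name:tag"` strings from the model's tag list and skips any tag already installed locally. A small sketch of that filtering, with illustrative data (the tuples below are hypothetical, not taken from the real `available_models.json`):

```python
# Tag filtering as in show_pull_menu(): keep only tags not yet installed.
model_name = "llama3"
tags = [("latest", "4.7GB"), ("8b", "4.7GB"), ("70b", "40GB")]  # (tag, size) — illustrative
installed = {"llama3:latest"}

rows = [
    f"{model_name}:{tag}"
    for tag, size in tags
    if f"{model_name}:{tag}" not in installed
]
print(rows)  # → ['llama3:8b', 'llama3:70b']
```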
class available_model_list(Gtk.ListBox):
    __gtype_name__ = 'AlpacaAvailableModelList'

    def __init__(self):
        super().__init__(
            selection_mode=0,
            css_classes=["boxed-list"],
            visible=False
        )

    def add_model(self, model_name:str, model_author:str, model_description:str, image_recognition:bool):
        model = available_model(model_name, model_author, model_description, image_recognition)
        self.append(model)
        if not self.get_visible():
            self.set_visible(True)

class model_manager_container(Gtk.Box):
    __gtype_name__ = 'AlpacaModelManagerContainer'

    def __init__(self):
        super().__init__(
            margin_top=12,
            margin_bottom=12,
            margin_start=12,
            margin_end=12,
            spacing=12,
            orientation=1
        )
        self.pulling_list = pulling_model_list()
        self.append(self.pulling_list)
        self.local_list = local_model_list()
        self.append(self.local_list)
        self.available_list = available_model_list()
        self.append(self.available_list)
        self.model_selector = model_selector_button()
        window.header_bar.set_title_widget(self.model_selector)

    def add_local_model(self, model_name:str):
        self.local_list.add_model(model_name)
        if not self.local_list.get_visible():
            self.local_list.set_visible(True)
        self.model_selector.add_model(model_name)

    def remove_local_model(self, model_name:str):
        logger.debug("Deleting model")
        response = window.ollama_instance.request("DELETE", "api/delete", json.dumps({"name": model_name}))

        if response.status_code == 200:
            self.local_list.remove_model(model_name)
            self.model_selector.remove_model(model_name)
            if len(self.get_model_list()) == 0:
                self.local_list.set_visible(False)
            window.chat_list_box.update_welcome_screens(False)
            window.show_toast(_("Model deleted successfully"), window.manage_models_overlay)
        else:
            window.manage_models_dialog.close()
            window.connection_error()

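`remove_local_model` above issues a DELETE to Ollama's `api/delete` endpoint with a JSON body of the form `{"name": ...}`. A hedged sketch of that payload and status handling, with the transport injected as a callable so no live server is assumed (`fake_request` is a stub, not the app's connection handler):

```python
import json

def delete_model(request, model_name):
    # Build the /api/delete payload and treat HTTP 200 as success,
    # as remove_local_model() does.
    payload = json.dumps({"name": model_name})
    status = request("DELETE", "api/delete", payload)
    return status == 200

# Stub standing in for window.ollama_instance.request
def fake_request(method, endpoint, body):
    assert method == "DELETE" and endpoint == "api/delete"
    return 200 if ":" in json.loads(body)["name"] else 404

print(delete_model(fake_request, "llama3:latest"))  # → True
```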
    def get_selected_model(self) -> str:
        row = self.model_selector.get_popover().model_list_box.get_selected_row()
        if row:
            return row.get_name()

    def get_model_list(self) -> list:
        return [model.get_name() for model in list(self.model_selector.get_popover().model_list_box)]

    #Should only be called when the app starts
    def update_local_list(self):
        try:
            response = window.ollama_instance.request("GET", "api/tags")
            if response.status_code == 200:
                self.local_list.remove_all()
                data = json.loads(response.text)
                if len(data['models']) == 0:
                    self.local_list.set_visible(False)
                else:
                    self.local_list.set_visible(True)
                    for model in data['models']:
                        self.add_local_model(model['name'])
            else:
                window.connection_error()
        except Exception as e:
            logger.error(e)
            window.connection_error()

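`update_local_list` above parses the body of `GET api/tags`, whose JSON shape is `{"models": [{"name": ...}, ...]}`. A minimal sketch of that parsing step, using a canned response (the model names below are illustrative):

```python
import json

# Canned /api/tags response body, as update_local_list() would receive it.
response_text = '{"models": [{"name": "llama3:latest"}, {"name": "mistral:7b"}]}'

data = json.loads(response_text)
names = [model['name'] for model in data['models']]
print(names)  # → ['llama3:latest', 'mistral:7b']
```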
    #Should only be called when the app starts
    def update_available_list(self):
        with open(os.path.join(source_dir, 'available_models.json'), 'r', encoding="utf-8") as f:
            for name, model_info in json.load(f).items():
                self.available_list.add_model(name, model_info['author'], available_models_descriptions.descriptions[name], model_info['image'])

    def change_model(self, model_name:str):
        self.model_selector.change_model(model_name)

    def verify_if_image_can_be_used(self):
        logger.debug("Verifying if image can be used")
        selected = self.get_selected_model()
        if selected == None:
            return False
        selected = selected.split(":")[0]
        with open(os.path.join(source_dir, 'available_models.json'), 'r', encoding="utf-8") as f:
            if selected in [key for key, value in json.load(f).items() if value["image"]]:
                for name, content in window.attachments.items():
                    if content['type'] == 'image':
                        content['button'].set_css_classes(["flat"])
                return True
            for name, content in window.attachments.items():
                if content['type'] == 'image':
                    content['button'].set_css_classes(["flat", "error"])
            return False

    def pull_model(self, model_name:str, modelfile:str=None):
        if ':' not in model_name:
            model_name += ':latest'
        if model_name not in [model.get_name() for model in list(self.pulling_list)] and model_name not in [model.get_name() for model in list(self.local_list)]:
            logger.info("Pulling model: {}".format(model_name))
            model = pulling_model(model_name)
            self.pulling_list.append(model)
            if not self.pulling_list.get_visible():
                GLib.idle_add(self.pulling_list.set_visible, True)

            if modelfile:
                response = window.ollama_instance.request("POST", "api/create", json.dumps({"name": model_name, "modelfile": modelfile}), lambda data: model.update(data))
            else:
                response = window.ollama_instance.request("POST", "api/pull", json.dumps({"name": model_name}), lambda data: model.update(data))

            if response.status_code == 200 and not model.error:
                GLib.idle_add(window.show_notification, _("Task Complete"), _("Model '{}' pulled successfully.").format(model_name), Gio.ThemedIcon.new("emblem-ok-symbolic"))
                GLib.idle_add(window.show_toast, _("Model '{}' pulled successfully.").format(model_name), window.manage_models_overlay)
                self.add_local_model(model_name)
            elif response.status_code == 200:
                GLib.idle_add(window.show_notification, _("Pull Model Error"), _("Failed to pull model '{}': {}").format(model_name, model.error), Gio.ThemedIcon.new("dialog-error-symbolic"))
                GLib.idle_add(window.show_toast, _("Error pulling '{}': {}").format(model_name, model.error), window.manage_models_overlay)
            else:
                GLib.idle_add(window.show_notification, _("Pull Model Error"), _("Failed to pull model '{}' due to network error.").format(model_name), Gio.ThemedIcon.new("dialog-error-symbolic"))
                GLib.idle_add(window.show_toast, _("Error pulling '{}'").format(model_name), window.manage_models_overlay)
                GLib.idle_add(window.manage_models_dialog.close)
                GLib.idle_add(window.connection_error)

            self.pulling_list.remove(model)
            GLib.idle_add(window.chat_list_box.update_welcome_screens, len(self.get_model_list()) > 0)
            if len(list(self.pulling_list)) == 0:
                GLib.idle_add(self.pulling_list.set_visible, False)
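During a pull, Ollama streams newline-delimited JSON objects carrying either `status`, `completed`/`total`, or `error` fields, and `pulling_model.update` turns each one into a progress-bar update. A GTK-free sketch of that dispatch (a hypothetical helper mirroring `update()`, including its `{:05.2%}` percentage formatting), fed a simulated stream:

```python
import json

def pull_progress(lines):
    # Classify each streamed JSON object the way pulling_model.update() does.
    updates = []
    for line in lines:
        data = json.loads(line)
        if 'error' in data:
            updates.append(('error', data['error']))
        elif 'total' in data and 'completed' in data:
            fraction = round(data['completed'] / data['total'], 4)
            updates.append(('fraction', f"{fraction:05.2%}"))
        else:
            updates.append(('status', data['status']))
    return updates

stream = [
    '{"status": "pulling manifest"}',
    '{"status": "downloading", "completed": 512, "total": 1024}',
    '{"status": "success"}',
]
print(pull_progress(stream))
# → [('status', 'pulling manifest'), ('fraction', '50.00%'), ('status', 'success')]
```

In the widget itself each tuple becomes a `GLib.idle_add` call, so the UI is only touched from the main loop while the download runs on a worker thread.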
@@ -37,7 +37,8 @@ class TableWidget(Gtk.Frame):
     def __init__(self, markdown):
         super().__init__()
+        self.set_margin_start(5)
+        self.set_margin_end(5)
         self.table = MarkdownTable()

         self.set_halign(Gtk.Align.START)
@@ -3,18 +3,18 @@
 Handles UI dialogs
 """
 import os
-import logging
+import logging, requests, threading, shutil
 from pytube import YouTube
 from html2text import html2text
 from gi.repository import Adw, Gtk
-from . import connection_handler
+from .internal import cache_dir

 logger = logging.getLogger(__name__)
 # CLEAR CHAT | WORKS

 def clear_chat_response(self, dialog, task):
     if dialog.choose_finish(task) == "clear":
-        self.clear_chat()
+        self.chat_list_box.get_current_chat().clear_chat()

 def clear_chat(self):
     if self.bot_message is not None:
@@ -39,7 +39,7 @@ def clear_chat(self):
 def delete_chat_response(self, dialog, task, chat_name):
     if dialog.choose_finish(task) == "delete":
-        self.delete_chat(chat_name)
+        self.chat_list_box.delete_chat(chat_name)

 def delete_chat(self, chat_name):
     dialog = Adw.AlertDialog(
@@ -59,16 +59,16 @@ def delete_chat(self, chat_name):
 # RENAME CHAT | WORKS

-def rename_chat_response(self, dialog, task, old_chat_name, entry, label_element):
+def rename_chat_response(self, dialog, task, old_chat_name, entry):
     if not entry:
         return
     new_chat_name = entry.get_text()
     if old_chat_name == new_chat_name:
         return
     if new_chat_name and (task is None or dialog.choose_finish(task) == "rename"):
-        self.rename_chat(old_chat_name, new_chat_name, label_element)
+        self.chat_list_box.rename_chat(old_chat_name, new_chat_name)

-def rename_chat(self, chat_name, label_element):
+def rename_chat(self, chat_name):
     entry = Gtk.Entry()
     dialog = Adw.AlertDialog(
         heading=_("Rename Chat?"),
@@ -83,7 +83,7 @@ def rename_chat(self, chat_name, label_element):
     dialog.choose(
         parent = self,
         cancellable = None,
-        callback = lambda dialog, task, old_chat_name=chat_name, entry=entry, label_element=label_element: rename_chat_response(self, dialog, task, old_chat_name, entry, label_element)
+        callback = lambda dialog, task, old_chat_name=chat_name, entry=entry: rename_chat_response(self, dialog, task, old_chat_name, entry)
     )

 # NEW CHAT | WORKS | UNUSED REASON: The 'Add Chat' button now creates a chat without a name AKA "New Chat"
@@ -116,15 +116,16 @@ def new_chat(self):
 # STOP PULL MODEL | WORKS

-def stop_pull_model_response(self, dialog, task, model_name):
+def stop_pull_model_response(self, dialog, task, pulling_model):
     if dialog.choose_finish(task) == "stop":
-        self.stop_pull_model(model_name)
+        if len(list(pulling_model.get_parent())) == 1:
+            pulling_model.get_parent().set_visible(False)
+        pulling_model.get_parent().remove(pulling_model)

-def stop_pull_model(self, model_name):
-    #self.pulling_model_list_box.unselect_all()
+def stop_pull_model(self, pulling_model):
     dialog = Adw.AlertDialog(
         heading=_("Stop Download?"),
-        body=_("Are you sure you want to stop pulling '{} ({})'?").format(model_name.split(":")[0].capitalize(), model_name.split(":")[1]),
+        body=_("Are you sure you want to stop pulling '{}'?").format(self.convert_model_name(pulling_model.get_name(), 0)),
         close_response="cancel"
     )
     dialog.add_response("cancel", _("Cancel"))
@@ -134,19 +135,19 @@ def stop_pull_model(self, model_name):
     dialog.choose(
         parent = self.manage_models_dialog,
         cancellable = None,
-        callback = lambda dialog, task, model_name = model_name: stop_pull_model_response(self, dialog, task, model_name)
+        callback = lambda dialog, task, model=pulling_model: stop_pull_model_response(self, dialog, task, model)
     )

 # DELETE MODEL | WORKS

 def delete_model_response(self, dialog, task, model_name):
     if dialog.choose_finish(task) == "delete":
-        self.delete_model(model_name)
+        self.model_manager.remove_local_model(model_name)

 def delete_model(self, model_name):
     dialog = Adw.AlertDialog(
         heading=_("Delete Model?"),
-        body=_("Are you sure you want to delete '{}'?").format(model_name),
+        body=_("Are you sure you want to delete '{}'?").format(self.convert_model_name(model_name, 0)),
         close_response="cancel"
     )
     dialog.add_response("cancel", _("Cancel"))
@@ -187,21 +188,27 @@ def remove_attached_file(self, name):
 def reconnect_remote_response(self, dialog, task, url_entry, bearer_entry):
     response = dialog.choose_finish(task)
     if not task or response == "remote":
-        self.connect_remote(url_entry.get_text(), bearer_entry.get_text())
+        self.remote_connection_entry.set_text(url_entry.get_text())
+        self.remote_connection_switch.set_sensitive(url_entry.get_text())
+        self.remote_bearer_token_entry.set_text(bearer_entry.get_text())
+        self.remote_connection_switch.set_active(True)
+        self.model_manager.update_local_list()
     elif response == "local":
-        self.connect_local()
+        self.ollama_instance.remote = False
+        self.ollama_instance.start()
+        self.model_manager.update_local_list()
     elif response == "close":
         self.destroy()

-def reconnect_remote(self, current_url, current_bearer_token):
+def reconnect_remote(self):
     entry_url = Gtk.Entry(
         css_classes = ["error"],
-        text = current_url,
+        text = self.ollama_instance.remote_url,
         placeholder_text = "URL"
     )
     entry_bearer_token = Gtk.Entry(
-        css_classes = ["error"] if current_bearer_token else None,
-        text = current_bearer_token,
+        css_classes = ["error"] if self.ollama_instance.bearer_token else None,
+        text = self.ollama_instance.bearer_token,
         placeholder_text = "Bearer Token (Optional)"
     )
     container = Gtk.Box(
@@ -216,7 +223,8 @@ def reconnect_remote(self, current_url, current_bearer_token):
         extra_child=container
     )
     dialog.add_response("close", _("Close Alpaca"))
-    dialog.add_response("local", _("Use local instance"))
+    if shutil.which('ollama'):
+        dialog.add_response("local", _("Use local instance"))
     dialog.add_response("remote", _("Connect"))
     dialog.set_response_appearance("remote", Adw.ResponseAppearance.SUGGESTED)
     dialog.set_default_response("remote")
@@ -235,7 +243,7 @@ def create_model_from_existing_response(self, dialog, task, dropdown):
 def create_model_from_existing(self):
     string_list = Gtk.StringList()
-    for model in self.local_models:
+    for model in self.model_manager.get_model_list():
         string_list.append(self.convert_model_name(model, 0))

     dropdown = Gtk.DropDown()
@@ -273,7 +281,7 @@ def create_model_from_file(self):
 def create_model_from_name_response(self, dialog, task, entry):
     model = entry.get_text().lower().strip()
     if dialog.choose_finish(task) == 'accept' and model:
-        self.pull_model(model)
+        threading.Thread(target=self.model_manager.pull_model, kwargs={"model_name": model}).start()

 def create_model_from_name(self):
     entry = Gtk.Entry()
@@ -330,11 +338,11 @@ def youtube_caption_response(self, dialog, task, video_url, caption_drop_down):
     yt = YouTube(video_url)
     text = "{}\n{}\n{}\n\n".format(yt.title, yt.author, yt.watch_url)
     selected_caption = caption_drop_down.get_selected_item().get_string()
-    for event in yt.captions[selected_caption.split('(')[1][:-1]].json_captions['events']:
+    for event in yt.captions[selected_caption.split('(')[-1][:-1]].json_captions['events']:
         text += "{}\n".format(event['segs'][0]['utf8'].replace('\n', '\\n'))
-    if not os.path.exists(os.path.join(self.cache_dir, 'tmp/youtube')):
-        os.makedirs(os.path.join(self.cache_dir, 'tmp/youtube'))
-    file_path = os.path.join(os.path.join(self.cache_dir, 'tmp/youtube'), f'{yt.title} ({selected_caption.split(" | ")[0]})')
+    if not os.path.exists(os.path.join(cache_dir, 'tmp/youtube')):
+        os.makedirs(os.path.join(cache_dir, 'tmp/youtube'))
+    file_path = os.path.join(os.path.join(cache_dir, 'tmp/youtube'), f'{yt.title} ({selected_caption.split(" (")[0]})')
     with open(file_path, 'w+', encoding="utf-8") as f:
         f.write(text)
     self.attach_file(file_path, 'youtube')
@@ -373,7 +381,7 @@ def youtube_caption(self, video_url):
 def attach_website_response(self, dialog, task, url):
     if dialog.choose_finish(task) == "accept":
-        response = connection_handler.simple_get(url)
+        response = requests.get(url)
         if response.status_code == 200:
             html = response.text
             md = html2text(html)
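One of the hunks above changes the caption-code extraction from `split('(')[1]` to `split('(')[-1]`: taking the last parenthesized field instead of the second survives parentheses inside the caption's display name. A sketch of the difference, with a hypothetical label in the assumed "Name (…) (code)" format:

```python
# Hypothetical dropdown label; the trailing parenthetical is the caption code.
label = "English (auto-generated) (a.en)"

old_code = label.split('(')[1][:-1]   # picks the wrong field when the name has parens
new_code = label.split('(')[-1][:-1]  # always picks the trailing code
print(old_code, new_code)  # → auto-generated) a.en
```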
 src/icons/chat-bubble-text-symbolic.svg (new file, 2 lines)
@@ -0,0 +1,2 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<svg xmlns="http://www.w3.org/2000/svg" height="16px" viewBox="0 0 16 16" width="16px"><g fill="#222222"><path d="m 3 0 c -1.644531 0 -3 1.355469 -3 3 v 6 c 0 1.644531 1.355469 3 3 3 h 1 v 4 l 4 -4 h 5 c 1.644531 0 3 -1.355469 3 -3 v -6 c 0 -1.644531 -1.355469 -3 -3 -3 z m 0 2 h 10 c 0.570312 0 1 0.429688 1 1 v 6 c 0 0.570312 -0.429688 1 -1 1 h -10 c -0.570312 0 -1 -0.429688 -1 -1 v -6 c 0 -0.570312 0.429688 -1 1 -1 z m 0 0"/><path d="m 3 3 h 9 v 2 h -9 z m 0 0"/><path d="m 3 6 h 6 v 2 h -6 z m 0 0"/></g></svg>
@@ -1,58 +0,0 @@
-# local_instance.py
-"""
-Handles running, stopping and resetting the integrated Ollama instance
-"""
-import subprocess
-import threading
-import os
-from time import sleep
-from logging import getLogger
-from .internal import data_dir, cache_dir
-
-
-logger = getLogger(__name__)
-
-instance = None
-port = 11435
-overrides = {}
-
-def log_output(pipe):
-    with open(os.path.join(data_dir, 'tmp.log'), 'a') as f:
-        with pipe:
-            for line in iter(pipe.readline, ''):
-                print(line, end='')
-                f.write(line)
-                f.flush()
-
-def start():
-    if not os.path.isdir(os.path.join(cache_dir, 'tmp/ollama')):
-        os.mkdir(os.path.join(cache_dir, 'tmp/ollama'))
-    global instance
-    params = overrides.copy()
-    params["OLLAMA_HOST"] = f"127.0.0.1:{port}" # You can't change this directly sorry :3
-    params["HOME"] = data_dir
-    params["TMPDIR"] = os.path.join(cache_dir, 'tmp/ollama')
-    instance = subprocess.Popen(["ollama", "serve"], env={**os.environ, **params}, stderr=subprocess.PIPE, stdout=subprocess.PIPE, text=True)
-    threading.Thread(target=log_output, args=(instance.stdout,)).start()
-    threading.Thread(target=log_output, args=(instance.stderr,)).start()
-    logger.info("Starting Alpaca's Ollama instance...")
-    logger.debug(params)
-    sleep(1)
-    logger.info("Started Alpaca's Ollama instance")
-    v_str = subprocess.check_output("ollama -v", shell=True).decode('utf-8')
-    logger.info('Ollama version: {}'.format(v_str.split('client version is ')[1].strip()))
-
-def stop():
-    logger.info("Stopping Alpaca's Ollama instance")
-    global instance
-    if instance:
-        instance.terminate()
-        instance.wait()
-        instance = None
-    logger.info("Stopped Alpaca's Ollama instance")
-
-def reset():
-    logger.info("Resetting Alpaca's Ollama instance")
-    stop()
-    sleep(1)
-    start()
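The deleted `local_instance.start()` above launched `ollama serve` with a few variables (`OLLAMA_HOST`, `HOME`, `TMPDIR`) merged over the inherited environment. A self-contained sketch of that env-override pattern, using the Python interpreter as the child process instead of `ollama` so it runs anywhere:

```python
import os
import subprocess
import sys

# Variables merged over os.environ, as start() did for 'ollama serve'.
overrides = {"OLLAMA_HOST": "127.0.0.1:11435"}

child = subprocess.run(
    [sys.executable, "-c", "import os; print(os.environ['OLLAMA_HOST'])"],
    env={**os.environ, **overrides},  # child sees parent env plus overrides
    capture_output=True, text=True,
)
print(child.stdout.strip())  # → 127.0.0.1:11435
```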
@@ -44,7 +44,10 @@ translators = [
     'Aritra Saha (Bengali) https://github.com/olumolu',
     'Yuehao Sui (Simplified Chinese) https://github.com/8ar10der',
     'Aleksana (Simplified Chinese) https://github.com/Aleksanaa',
-    'Aritra Saha (Hindi) https://github.com/olumolu'
+    'Aritra Saha (Hindi) https://github.com/olumolu',
+    'YusaBecerikli (Turkish) https://github.com/YusaBecerikli',
+    'Simon (Ukrainian) https://github.com/OriginalSimon',
+    'Marcel Margenberg (German) https://github.com/MehrzweckMandala'
 ]

 class AlpacaApplication(Adw.Application):
@@ -54,8 +57,9 @@ class AlpacaApplication(Adw.Application):
         super().__init__(application_id='com.jeffser.Alpaca',
                          flags=Gio.ApplicationFlags.DEFAULT_FLAGS)
         self.create_action('quit', lambda *_: self.props.active_window.closing_app(None), ['<primary>w', '<primary>q'])
-        self.create_action('preferences', lambda *_: AlpacaWindow.show_preferences_dialog(self.props.active_window), ['<primary>comma'])
+        self.create_action('preferences', lambda *_: self.props.active_window.preferences_dialog.present(self.props.active_window), ['<primary>comma'])
         self.create_action('about', self.on_about_action)
+        self.set_accels_for_action("win.show-help-overlay", ['<primary>slash'])
         self.version = version

     def do_activate(self):
@@ -41,11 +41,17 @@ alpaca_sources = [
     'window.py',
     'connection_handler.py',
     'dialogs.py',
-    'local_instance.py',
     'available_models.json',
     'available_models_descriptions.py',
-    'table_widget.py',
     'internal.py'
 ]

+custom_widgets = [
+    'custom_widgets/table_widget.py',
+    'custom_widgets/message_widget.py',
+    'custom_widgets/chat_widget.py',
+    'custom_widgets/model_widget.py'
+]
+
 install_data(alpaca_sources, install_dir: moduledir)
+install_data(custom_widgets, install_dir: moduledir / 'custom_widgets')
@@ -18,13 +18,22 @@
 .model_list_box > * {
     margin: 0;
 }
-.user_message, .response_message {
-    padding: 12px;
+.user_message > label, .response_message > label {
+    padding: 7px;
     border-radius: 10px;
 }
-.user_message:focus, .response_message:focus, .editing_message_textview:focus, .code_block:focus {
+.user_message label:focus, .response_message label:focus, .editing_message_textview:focus, .code_block:focus {
     box-shadow: 0 0 1px 2px mix(@accent_color, @window_bg_color, 0.5);
 }
 .model_popover {
     margin-top: 6px;
 }
+stacksidebar {
+    border: none;
+}
+.image_recognition_indicator {
+    padding: 0px 10px;
+}
+.code_block {
+    font-family: monospace;
+}
src/window.py: 1716 lines changed (diff suppressed because it is too large)
src/window.ui: 566 lines changed
@@ -17,317 +17,209 @@
 </object>
 </child>
 <property name="content">
 <object class="AdwOverlaySplitView" id="split_view_overlay">
 <property name="show-sidebar" bind-source="show_sidebar_button" bind-property="active" bind-flags="sync-create"/>
 <property name="sidebar-width-fraction">0.4</property>
 <property name="sidebar">
 <object class="AdwToolbarView">
 <child type="top">
 <object class="AdwHeaderBar">
 <child type="start">
 <object class="GtkButton" id="add_chat_button">
-<property name="tooltip-text" translatable="yes">New Chat</property>
-<property name="icon-name">chat-message-new-symbolic</property>
-<style>
-<class name="flat"/>
-</style>
-</object>
-</child>
-<child type="end">
-<object class="GtkMenuButton">
-<property name="primary">True</property>
-<property name="icon-name">open-menu-symbolic</property>
-<property name="tooltip-text" translatable="yes">Menu</property>
-<property name="menu-model">primary_menu</property>
-</object>
-</child>
+<property name="action-name">app.new_chat</property>
+<property name="tooltip-text" translatable="yes">New Chat</property>
+<property name="icon-name">chat-message-new-symbolic</property>
+<style>
+<class name="flat"/>
+</style>
 </object>
 </child>
-<property name="content">
-<object class="GtkScrolledWindow">
-<property name="vexpand">true</property>
-<property name="hexpand">true</property>
-<child>
-<object class="GtkListBox" id="chat_list_box">
-<signal name="row-selected" handler="chat_changed"/>
-<property name="selection-mode">single</property>
-<style>
-<class name="navigation-sidebar"/>
-</style>
-</object>
-</child>
+<child type="end">
+<object class="GtkMenuButton">
+<property name="primary">True</property>
+<property name="icon-name">open-menu-symbolic</property>
+<property name="tooltip-text" translatable="yes">Menu</property>
+<property name="menu-model">primary_menu</property>
 </object>
-</property>
+</child>
+</object>
+</child>
+<property name="content">
+<object class="GtkScrolledWindow" id="chat_list_container">
+<property name="vexpand">true</property>
+<property name="hexpand">true</property>
 </object>
 </property>
-<child>
-<object class="AdwToolbarView">
-<child type="top">
-<object class="AdwHeaderBar" id="header_bar">
-<child type="start">
-<object class="GtkToggleButton" id="show_sidebar_button">
-<property name="icon-name">sidebar-show-symbolic</property>
-<property name="tooltip-text" translatable="yes">Toggle Sidebar</property>
-<property name="active" bind-source="split_view_overlay" bind-property="show-sidebar" bind-flags="sync-create"/>
-</object>
-</child>
-<property name="title-widget">
-<object class="GtkBox">
-<property name="orientation">0</property>
-<property name="spacing">12</property>
-<child>
-<object class="GtkMenuButton" id="model_selector_button">
-<property name="tooltip-text" translatable="yes">Select Model</property>
-<property name="child">
-<object class="GtkBox">
-<property name="spacing">10</property>
-<child>
-<object class="GtkLabel">
-<property name="label" translatable="yes">Select a Model</property>
-<property name="ellipsize">2</property>
-</object>
-</child>
-<child>
-<object class="GtkImage">
-<property name="icon-name">down-symbolic</property>
-</object>
-</child>
-</object>
-</property>
-<property name="halign">1</property>
-<style>
-<class name="raised"/>
-</style>
-<property name="popover">
-<object class="GtkPopover" id="model_popover">
-<style>
-<class name="model_popover"/>
-</style>
-<property name="has-arrow">false</property>
-<child>
-<object class="GtkBox">
-<property name="orientation">1</property>
-<property name="spacing">5</property>
-<child>
-<object class="GtkButton">
-<property name="child">
-<object class="GtkLabel">
-<property name="label" translatable="yes">Manage Models</property>
-<property name="justify">left</property>
-<property name="halign">1</property>
-</object>
-</property>
-<property name="hexpand">true</property>
-<property name="tooltip-text" translatable="yes">Manage Models</property>
-<property name="action-name">app.manage_models</property>
-<signal name="clicked" handler="close_model_popup"/>
-<style>
-<class name="flat"/>
-<class name="manage_models_button"/>
-</style>
-</object>
-</child>
-<child>
-<object class="GtkSeparator"/>
-</child>
-<child>
-<object class="GtkScrolledWindow">
-<property name="max-content-height">300</property>
-<property name="propagate-natural-width">true</property>
-<property name="propagate-natural-height">true</property>
-<child>
-<object class="GtkListBox" id="model_list_box">
-<property name="hexpand">true</property>
-<style>
-<class name="navigation-sidebar"/>
-<class name="model_list_box"/>
-</style>
-<signal name="row-selected" handler="change_model"/>
-</object>
-</child>
-</object>
-</child>
-
-</object>
-</child>
-</object>
-</property>
-</object>
-</child>
-</object>
-</property>
-<child type="end">
-<object class="GtkMenuButton" id="secondary_menu_button">
-<property name="primary">False</property>
-<property name="icon-name">view-more-symbolic</property>
-<property name="tooltip-text" translatable="yes">Chat Menu</property>
-<property name="menu-model">secondary_menu</property>
-</object>
-</child>
+</object>
+</property>
+<child>
+<object class="AdwToolbarView">
+<child type="top">
+<object class="AdwHeaderBar" id="header_bar">
+<child type="start">
+<object class="GtkToggleButton" id="show_sidebar_button">
+<property name="icon-name">sidebar-show-symbolic</property>
+<property name="tooltip-text" translatable="yes">Toggle Sidebar</property>
+<property name="active" bind-source="split_view_overlay" bind-property="show-sidebar" bind-flags="sync-create"/>
 </object>
 </child>
-<property name="content">
-<object class="GtkBox"><!--ACTUAL CONTENT-->
-<property name="orientation">1</property>
-<property name="vexpand">true</property>
+<child type="end">
+<object class="GtkMenuButton" id="secondary_menu_button">
+<property name="primary">False</property>
+<property name="icon-name">view-more-symbolic</property>
+<property name="tooltip-text" translatable="yes">Chat Menu</property>
+<property name="menu-model">secondary_menu</property>
+</object>
+</child>
+</object>
+</child>
+<property name="content">
+<object class="GtkBox"><!--ACTUAL CONTENT-->
+<property name="orientation">1</property>
+<property name="vexpand">true</property>
+<property name="hexpand">true</property>
+<child>
+<object class="AdwBanner" id="banner">
+<property name="button-label" translatable="true">Close</property>
+<property name="title" translatable="yes">Warning: Power saver mode is enabled, this will slow down message generation</property>
+</object>
+</child>
+<child>
+<object class="AdwToastOverlay" id="main_overlay">
+<child>
+<object class="GtkStack" id="chat_stack">
 <property name="hexpand">true</property>
+<property name="vexpand">true</property>
+<property name="hhomogeneous">true</property>
+</object>
+</child>
+</object>
+</child>
+<child>
+<object class="AdwClamp">
+<property name="maximum-size">1000</property>
+<property name="tightening-threshold">800</property>
 <child>
-<object class="AdwToastOverlay" id="main_overlay">
-<child>
-<object class="GtkScrolledWindow" id="chat_window">
-<property name="propagate-natural-height">true</property>
-<property name="kinetic-scrolling">true</property>
-<property name="vexpand">true</property>
-<style>
-<class name="undershoot-bottom"/>
-</style>
+<object class="GtkBox">
+<property name="orientation">1</property>
+<property name="spacing">12</property>
+<property name="margin-top">12</property>
+<property name="margin-bottom">12</property>
+<property name="margin-start">12</property>
+<property name="margin-end">12</property>
 <child>
-<object class="AdwClamp">
-<property name="maximum-size">1000</property>
-<property name="tightening-threshold">800</property>
+<object class="GtkScrolledWindow" id="attachment_box">
+<property name="visible">false</property>
 <child>
-<object class="GtkBox" id="chat_container">
-<property name="orientation">1</property>
-<property name="homogeneous">false</property>
-<property name="hexpand">true</property>
-<property name="vexpand">true</property>
+<object class="GtkBox" id="attachment_container">
+<property name="orientation">0</property>
+<property name="vexpand">false</property>
 <property name="spacing">12</property>
-<property name="margin-top">12</property>
-<property name="margin-bottom">12</property>
-<property name="margin-start">12</property>
-<property name="margin-end">12</property>
 </object>
 </child>
 </object>
 </child>
-</object>
-</child>
-</object>
-</child>
-<child>
-<object class="AdwClamp">
-<property name="maximum-size">1000</property>
-<property name="tightening-threshold">800</property>
 <child>
 <object class="GtkBox">
-<property name="orientation">1</property>
+<property name="orientation">0</property>
 <property name="spacing">12</property>
-<property name="margin-top">12</property>
-<property name="margin-bottom">12</property>
-<property name="margin-start">12</property>
-<property name="margin-end">12</property>
 <child>
-<object class="GtkScrolledWindow" id="attachment_box">
-<property name="visible">false</property>
+<object class="GtkButton" id="attachment_button">
+<property name="vexpand">false</property>
+<property name="valign">3</property>
+<property name="tooltip-text" translatable="yes">Attach File</property>
+<style>
+<class name="circular"/>
+</style>
 <child>
-<object class="GtkBox" id="attachment_container">
-<property name="orientation">0</property>
-<property name="vexpand">false</property>
-<property name="spacing">12</property>
+<object class="AdwButtonContent">
+<property name="icon-name">chain-link-loose-symbolic</property>
 </object>
 </child>
 </object>
 </child>
 <child>
 <object class="GtkBox">
-<property name="orientation">0</property>
-<property name="spacing">12</property>
+<style>
+<class name="card"/>
+</style>
 <child>
-<object class="GtkButton" id="attachment_button">
-<property name="vexpand">false</property>
-<property name="valign">3</property>
-<property name="tooltip-text" translatable="yes">Attach File</property>
+<object class="GtkScrolledWindow">
+<property name="max-content-height">150</property>
+<property name="propagate-natural-height">true</property>
+<property name="margin-start">10</property>
+<property name="margin-end">10</property>
 <style>
-<class name="circular"/>
+<class name="message_input_scroll_window"/>
 </style>
 <child>
-<object class="AdwButtonContent">
-<property name="icon-name">chain-link-loose-symbolic</property>
+<object class="GtkTextView" id="message_text_view">
+<signal name="paste-clipboard" handler="on_clipboard_paste"/>
+<style>
+<class name="message_text_view"/>
+</style>
+<property name="wrap-mode">word</property>
+<property name="top-margin">10</property>
+<property name="bottom-margin">10</property>
+<property name="hexpand">true</property>
+<property name="input-hints">spellcheck</property>
+<accessibility>
+<property name="label" translatable="yes">Message text box</property>
+</accessibility>
 </object>
 </child>
 </object>
 </child>
-<child>
-<object class="GtkBox">
-<style>
-<class name="card"/>
-</style>
-<child>
-<object class="GtkScrolledWindow">
-<property name="max-content-height">150</property>
-<property name="propagate-natural-height">true</property>
-<property name="margin-start">10</property>
-<property name="margin-end">10</property>
-<style>
-<class name="message_input_scroll_window"/>
-</style>
-<child>
-<object class="GtkTextView" id="message_text_view">
-<style>
-<class name="message_text_view"/>
-</style>
-<property name="wrap-mode">word</property>
-<property name="top-margin">10</property>
-<property name="bottom-margin">10</property>
-<property name="hexpand">true</property>
-<property name="input-hints">spellcheck</property>
-<accessibility>
-<property name="label" translatable="yes">Message text box</property>
-</accessibility>
-</object>
-</child>
-</object>
-</child>
-</object>
-</child>
+</object>
+</child>
+<child>
+<object class="GtkButton" id="send_button">
+<signal name="clicked" handler="send_message"/>
+<property name="vexpand">false</property>
+<property name="valign">3</property>
+<property name="tooltip-text" translatable="yes">Send Message</property>
+<style>
+<class name="accent"/>
+<class name="circular"/>
+<class name="suggested-action"/>
+</style>
 <child>
-<object class="GtkButton" id="send_button">
-<signal name="clicked" handler="send_message"/>
-<property name="vexpand">false</property>
-<property name="valign">3</property>
-<property name="tooltip-text" translatable="yes">Send Message</property>
-<style>
-<class name="accent"/>
-<class name="circular"/>
-<class name="suggested-action"/>
-</style>
-<child>
-<object class="AdwButtonContent">
-<property name="icon-name">paper-plane-symbolic</property>
-</object>
-</child>
-</object>
-</child>
-<child>
-<object class="GtkButton" id="stop_button">
-<signal name="clicked" handler="stop_message"/>
-<property name="vexpand">false</property>
-<property name="valign">3</property>
-<property name="visible">false</property>
-<style>
-<class name="destructive-action"/>
-<class name="circular"/>
-</style>
-<child>
-<object class="AdwButtonContent">
-<property name="icon-name">media-playback-stop-symbolic</property>
-</object>
-</child>
+<object class="AdwButtonContent">
+<property name="icon-name">paper-plane-symbolic</property>
 </object>
 </child>
 </object>
 </child>
-</object>
+<child>
+<object class="GtkButton" id="stop_button">
+<signal name="clicked" handler="stop_message"/>
+<property name="vexpand">false</property>
+<property name="valign">3</property>
+<property name="visible">false</property>
+<style>
+<class name="destructive-action"/>
+<class name="circular"/>
+</style>
+<child>
+<object class="AdwButtonContent">
+<property name="icon-name">media-playback-stop-symbolic</property>
+</object>
+</child>
+</object>
 </child>
 </object>
 
 </child>
+</object>
+</child>
 
 
 </object><!--END OF CONTENT-->
 </property>
 </object>
 </child>
 </object>
 </property>
 <object class="AdwPreferencesDialog" id="preferences_dialog">
@@ -343,6 +235,7 @@
 <object class="AdwPreferencesGroup">
 <child>
 <object class="AdwSwitchRow" id="remote_connection_switch">
+<signal name="notify::active" handler="change_remote_connection"/>
 <property name="title" translatable="yes">Use Remote Connection to Ollama</property>
 </object>
 </child>
@@ -366,15 +259,22 @@
 <object class="AdwPreferencesGroup">
 <child>
 <object class="AdwSwitchRow" id="background_switch">
+<signal name="notify::active" handler="switch_run_on_background"/>
 <property name="title" translatable="yes">Run Alpaca In Background</property>
 </object>
 </child>
+<child>
+<object class="AdwSwitchRow" id="powersaver_warning_switch">
+<signal name="notify::active" handler="switch_powersaver_warning"/>
+<property name="title" translatable="yes">Show Power Saver Warning</property>
+</object>
+</child>
 </object>
 </child>
 <child>
-<object class="AdwPreferencesGroup">
+<object class="AdwPreferencesGroup" id="tweaks_group">
 <child>
-<object class="AdwSpinRow" id="temperature_spin">
+<object class="AdwSpinRow">
 <signal name="changed" handler="model_spin_changed"/>
 <property name="name">temperature</property>
 <property name="title" translatable="yes">Temperature</property>
@@ -390,7 +290,7 @@
 </object>
 </child>
 <child>
-<object class="AdwSpinRow" id="seed_spin">
+<object class="AdwSpinRow">
 <signal name="changed" handler="model_spin_changed"/>
 <property name="name">seed</property>
 <property name="title" translatable="yes">Seed</property>
@@ -405,7 +305,7 @@
 </object>
 </child>
 <child>
-<object class="AdwSpinRow" id="keep_alive_spin">
+<object class="AdwSpinRow">
 <signal name="changed" handler="model_spin_changed"/>
 <property name="name">keep_alive</property>
 <property name="title" translatable="yes">Keep Alive Time</property>
@@ -428,11 +328,11 @@
 <property name="title" translatable="yes">Ollama Instance</property>
 <property name="icon-name">brain-augemnted-symbolic</property>
 <child>
-<object class="AdwPreferencesGroup">
+<object class="AdwPreferencesGroup" id="overrides_group">
 <property name="title" translatable="yes">Ollama Overrides</property>
 <property name="description" translatable="yes">Manage the arguments used on Ollama, any changes on this page only applies to the integrated instance, the instance will restart if you make changes.</property>
 <child>
-<object class="AdwEntryRow" id="override_HSA_OVERRIDE_GFX_VERSION">
+<object class="AdwEntryRow">
 <signal name="apply" handler="override_changed"/>
 <property name="name">HSA_OVERRIDE_GFX_VERSION</property>
 <property name="title" translatable="no">HSA_OVERRIDE_GFX_VERSION</property>
@@ -453,7 +353,7 @@
 </object>
 </child>
 <child>
-<object class="AdwEntryRow" id="override_CUDA_VISIBLE_DEVICES">
+<object class="AdwEntryRow">
 <signal name="apply" handler="override_changed"/>
 <property name="name">CUDA_VISIBLE_DEVICES</property>
 <property name="title" translatable="no">CUDA_VISIBLE_DEVICES</property>
@@ -474,7 +374,7 @@
 </object>
 </child>
 <child>
-<object class="AdwEntryRow" id="override_HIP_VISIBLE_DEVICES">
+<object class="AdwEntryRow">
 <signal name="apply" handler="override_changed"/>
 <property name="name">HIP_VISIBLE_DEVICES</property>
 <property name="title" translatable="no">HIP_VISIBLE_DEVICES</property>
@@ -496,6 +396,47 @@
 </child>
 </object>
 </child>
+<child>
+<object class="AdwPreferencesGroup">
+<child>
+<object class="AdwSpinRow" id="instance_idle_timer">
+<signal name="changed" handler="instance_idle_timer_changed"/>
+<property name="name">timer</property>
+<property name="title" translatable="yes">Idle Timer</property>
+<property name="subtitle" translatable="yes">Number of minutes the instance should remain idle before it is shut down (0 means it won't be shut down)</property>
+<property name="digits">0</property>
+<property name="adjustment">
+<object class="GtkAdjustment">
+<property name="lower">0</property>
+<property name="upper">60</property>
+<property name="step-increment">5</property>
+</object>
+</property>
+</object>
+</child>
+</object>
+</child>
+</object>
+</child>
+</object>
+
+<object class="AdwDialog" id="launch_dialog">
+<accessibility>
+<property name="label" translatable="yes">Loading Alpaca dialog</property>
+</accessibility>
+<property name="width-request">400</property>
+<property name="can-close">false</property>
+<child>
+<object class="AdwStatusPage" id="launch_status">
+<property name="icon_name">com.jeffser.Alpaca</property>
+<property name="title" translatable="yes">Loading Alpaca...</property>
+<property name="child">
+<object class="GtkLevelBar" id="launch_level_bar">
+<property name="mode">1</property>
+<property name="min-value">0</property>
+<property name="max-value">5</property>
+</object>
+</property>
 </object>
 </child>
 </object>
@@ -555,51 +496,38 @@
 </object>
 </child>
 <property name="content">
-<object class="GtkScrolledWindow">
+<object class="GtkBox">
 <property name="hexpand">true</property>
 <property name="vexpand">true</property>
 <child>
-<object class="GtkBox">
-<property name="margin-start">12</property>
-<property name="margin-end">12</property>
-<property name="margin-top">12</property>
-<property name="margin-bottom">12</property>
-<property name="orientation">1</property>
-<property name="spacing">12</property>
-<child>
-<object class="GtkListBox" id="pulling_model_list_box">
-<property name="visible">false</property>
-<property name="selection-mode">none</property>
+<object class="GtkScrolledWindow" id="model_scroller">
+<property name="hexpand">true</property>
+<property name="vexpand">true</property>
+</object>
+</child>
+<child>
+<object class="AdwStatusPage" id="no_results_page">
+<property name="visible">false</property>
+<property name="vexpand">true</property>
+<property name="hexpand">true</property>
+<property name="icon-name">edit-find-symbolic</property>
+<property name="title" translatable="yes">No Models Found</property>
+<property name="description" translatable="yes">Try a different search or pull an unlisted model from it's name</property>
+<property name="child">
+<object class="GtkButton">
+<property name="tooltip-text">Pull Model From Name</property>
+<property name="action-name">app.create_model_from_name</property>
+<property name="halign">center</property>
+<property name="child">
+<object class="GtkLabel">
+<property name="label" translatable="yes">Pull Model From Name</property>
+</object>
+</property>
 <style>
-<class name="boxed-list"/>
+<class name="suggested-action"/>
 </style>
 </object>
-</child>
-<child>
-<object class="GtkListBox" id="local_model_list_box">
-<property name="selection-mode">none</property>
-<style>
-<class name="boxed-list"/>
-</style>
-</object>
-</child>
-<child>
-<object class="GtkListBox" id="available_model_list_box">
-<property name="selection-mode">none</property>
-<style>
-<class name="boxed-list"/>
-</style>
-</object>
-</child>
-<child>
-<object class="AdwStatusPage" id="no_results_page">
-<property name="visible">false</property>
-<property name="vexpand">true</property>
-<property name="icon-name">edit-find-symbolic</property>
-<property name="title" translatable="yes">No Models Found</property>
-<property name="description" translatable="yes">Try a different search</property>
-</object>
-</child>
+</property>
 </object>
 </child>
 </object>
@@ -643,6 +571,9 @@
 <property name="margin-start">12</property>
 <property name="margin-end">12</property>
 <property name="label" translatable="yes">By downloading this model you accept the license agreement available on the model's website.</property>
+<style>
+<class name="dim-label"/>
+</style>
 </object>
 </child>
 <child>
@@ -1139,6 +1070,12 @@
             <property name="title" translatable="yes">Toggle sidebar</property>
           </object>
         </child>
+        <child>
+          <object class="GtkShortcutsShortcut">
+            <property name="accelerator">F2</property>
+            <property name="title" translatable="yes">Rename chat</property>
+          </object>
+        </child>
       </object>
     </child>
     <child>
@@ -1175,3 +1112,4 @@
     </object>
 </interface>
+
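A quick way to catch mistakes when hand-editing a GTK builder `.ui` file like the one above is to check that the XML still parses before committing. A minimal sketch using only Python's standard library; the snippet below uses a trimmed stand-in fragment, not the actual file from this change:

```python
import xml.etree.ElementTree as ET

# A trimmed stand-in for a builder file; a real check would use
# ET.parse() on the actual .ui path instead of an inline string.
UI_SNIPPET = """<interface>
  <object class="GtkListBox" id="available_model_list_box">
    <property name="selection-mode">none</property>
  </object>
</interface>"""

def is_well_formed(xml_text: str) -> bool:
    """Return True if the builder XML parses cleanly."""
    try:
        ET.fromstring(xml_text)
        return True
    except ET.ParseError:
        return False

print(is_well_formed(UI_SNIPPET))  # prints True; a mismatched tag would print False
```

This only verifies well-formedness; it does not validate GTK class or property names.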
@@ -1,15 +0,0 @@
-"""
-Moves the descriptions of models to src/available_models_descriptions.py
-so they can be translated
-"""
-import json
-
-if __name__ == "__main__":
-    with open('src/available_models.json', 'r', encoding="utf-8") as f:
-        data = json.load(f)
-    RESULTS = 'descriptions = {\n'
-    for key, value in data.items():
-        RESULTS += f"    '{key}': _(\"{value['description']}\"),\n"
-    RESULTS += '}'
-    with open('src/available_models_descriptions.py', 'w+', encoding="utf-8") as f:
-        f.write(RESULTS)
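The deleted helper above read `src/available_models.json` and emitted a Python module defining a `descriptions` dict wrapped in `_()` calls so gettext can extract the strings. Its core transform can be sketched in isolation; the sample JSON here is made up, and note the original approach would emit broken source if a description contained a double quote:

```python
import json

# Made-up stand-in for src/available_models.json.
SAMPLE_JSON = '{"llama2": {"description": "A foundational model."}}'

def build_descriptions_module(raw_json: str) -> str:
    """Mirror the deleted script: emit Python source defining a
    translatable `descriptions` dict, one _("...") entry per model."""
    data = json.loads(raw_json)
    lines = ['descriptions = {']
    for key, value in data.items():
        lines.append(f"    '{key}': _(\"{value['description']}\"),")
    lines.append('}')
    return '\n'.join(lines)

print(build_descriptions_module(SAMPLE_JSON))
```

The generated text is written to `src/available_models_descriptions.py` in the original script.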
@@ -15,6 +15,12 @@ msgmerge --no-fuzzy-matching -U po/nb_NO.po po/alpaca.pot
 echo "Updating Bengali"
 msgmerge --no-fuzzy-matching -U po/bn.po po/alpaca.pot
 echo "Updating Simplified Chinese"
-msgmerge --no-fuzzy-matching -U po/zh_CN.po po/alpaca.pot
+msgmerge --no-fuzzy-matching -U po/zh_Hans.po po/alpaca.pot
 echo "Updating Hindi"
 msgmerge --no-fuzzy-matching -U po/hi.po po/alpaca.pot
+echo "Updating Turkish"
+msgmerge --no-fuzzy-matching -U po/tr.po po/alpaca.pot
+echo "Updating Ukrainian"
+msgmerge --no-fuzzy-matching -U po/uk.po po/alpaca.pot
+echo "Updating German"
+msgmerge --no-fuzzy-matching -U po/de.po po/alpaca.pot
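The shell script grows by two lines for every locale added. The repeated `echo`/`msgmerge` pairs could equally be generated from a single table; a sketch, with locale codes taken from the hunk above and display names used only for the progress message:

```python
# Locale codes copied from the script above; names feed the echo line only.
LOCALES = [
    ("Simplified Chinese", "zh_Hans"),
    ("Turkish", "tr"),
    ("Ukrainian", "uk"),
    ("German", "de"),
]

def merge_commands(locales):
    """Build the echo/msgmerge command pairs the shell script runs per locale."""
    cmds = []
    for name, code in locales:
        cmds.append(f'echo "Updating {name}"')
        cmds.append(f"msgmerge --no-fuzzy-matching -U po/{code}.po po/alpaca.pot")
    return cmds

for line in merge_commands(LOCALES):
    print(line)
```

Keeping the explicit per-locale lines, as the script does, has the advantage that a single failing locale is easy to spot and rerun by hand.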