Compare commits
No commits in common. "main" and "2.0.2" have entirely different histories.
README.md (18 changes)
@@ -8,6 +8,9 @@ Alpaca is an [Ollama](https://github.com/ollama/ollama) client where you can man
 
 ---
 
+> [!NOTE]
+> Please checkout [this discussion](https://github.com/Jeffser/Alpaca/discussions/292), I want to start developing a new app alongside Alpaca but I need some suggestions, thanks!
+
 > [!WARNING]
 > This project is not affiliated at all with Ollama, I'm not responsible for any damages to your device or software caused by running code given by any AI models.
 
@@ -33,7 +36,7 @@ Alpaca is an [Ollama](https://github.com/ollama/ollama) client where you can man
 
 Normal conversation | Image recognition | Code highlighting | YouTube transcription | Model management
 :------------------:|:-----------------:|:-----------------:|:---------------------:|:----------------:
  |  |  |  | 
 
 ## Installation
 
@@ -45,14 +48,6 @@ You can find the latest stable version of the app on [Flathub](https://flathub.o
 
 Everytime a new version is published they become available on the [releases page](https://github.com/Jeffser/Alpaca/releases) of the repository
 
-### Snap Package
-
-You can also find the Snap package on the [releases page](https://github.com/Jeffser/Alpaca/releases), to install it run this command:
-```BASH
-sudo snap install ./{package name} --dangerous
-```
-The `--dangerous` comes from the package being installed without any involvement of the SnapStore, I'm working on getting the app there, but for now you can test the app this way.
-
 ### Building Git Version
 
 Note: This is not recommended since the prerelease versions of the app often present errors and general instability.
@@ -68,7 +63,7 @@ Language | Contributors
 🇷🇺 Russian | [Alex K](https://github.com/alexkdeveloper)
 🇪🇸 Spanish | [Jeffry Samuel](https://github.com/jeffser)
 🇫🇷 French | [Louis Chauvet-Villaret](https://github.com/loulou64490) , [Théo FORTIN](https://github.com/topiga)
-🇧🇷 Brazilian Portuguese | [Daimar Stein](https://github.com/not-a-dev-stein) , [Bruno Antunes](https://github.com/antun3s)
+🇧🇷 Brazilian Portuguese | [Daimar Stein](https://github.com/not-a-dev-stein)
 🇳🇴 Norwegian | [CounterFlow64](https://github.com/CounterFlow64)
 🇮🇳 Bengali | [Aritra Saha](https://github.com/olumolu)
 🇨🇳 Simplified Chinese | [Yuehao Sui](https://github.com/8ar10der) , [Aleksana](https://github.com/Aleksanaa)
@@ -76,8 +71,6 @@ Language | Contributors
 🇹🇷 Turkish | [YusaBecerikli](https://github.com/YusaBecerikli)
 🇺🇦 Ukrainian | [Simon](https://github.com/OriginalSimon)
 🇩🇪 German | [Marcel Margenberg](https://github.com/MehrzweckMandala)
-🇮🇱 Hebrew | [Yosef Or Boczko](https://github.com/yoseforb)
-🇮🇳 Telugu | [Aryan Karamtoth](https://github.com/SpaciousCoder78)
 
 Want to add a language? Visit [this discussion](https://github.com/Jeffser/Alpaca/discussions/153) to get started!
 
@@ -91,7 +84,6 @@ Want to add a language? Visit [this discussion](https://github.com/Jeffser/Alpac
 - [Nokse](https://github.com/Nokse22) for their contributions to the UI and table rendering
 - [Louis Chauvet-Villaret](https://github.com/loulou64490) for their suggestions
 - [Aleksana](https://github.com/Aleksanaa) for her help with better handling of directories
-- [Gnome Builder Team](https://gitlab.gnome.org/GNOME/gnome-builder) for the awesome IDE I use to develop Alpaca
 - Sponsors for giving me enough money to be able to take a ride to my campus every time I need to <3
 - Everyone that has shared kind words of encouragement!
 
@@ -1,7 +1,7 @@
 {
     "id" : "com.jeffser.Alpaca",
     "runtime" : "org.gnome.Platform",
-    "runtime-version" : "47",
+    "runtime-version" : "46",
     "sdk" : "org.gnome.Sdk",
     "command" : "alpaca",
     "finish-args" : [
@@ -11,8 +11,7 @@
         "--device=all",
         "--socket=wayland",
         "--filesystem=/sys/module/amdgpu:ro",
-        "--env=LD_LIBRARY_PATH=/app/lib:/usr/lib/x86_64-linux-gnu/GL/default/lib:/usr/lib/x86_64-linux-gnu/openh264/extra:/usr/lib/x86_64-linux-gnu/openh264/extra:/usr/lib/sdk/llvm15/lib:/usr/lib/x86_64-linux-gnu/GL/default/lib:/usr/lib/ollama:/app/plugins/AMD/lib/ollama",
-        "--env=GSK_RENDERER=ngl"
+        "--env=LD_LIBRARY_PATH=/app/lib:/usr/lib/x86_64-linux-gnu/GL/default/lib:/usr/lib/x86_64-linux-gnu/openh264/extra:/usr/lib/x86_64-linux-gnu/openh264/extra:/usr/lib/sdk/llvm15/lib:/usr/lib/x86_64-linux-gnu/GL/default/lib:/usr/lib/ollama:/app/plugins/AMD/lib/ollama"
     ],
     "add-extensions": {
         "com.jeffser.Alpaca.Plugins": {
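Incidentally, the `--env=LD_LIBRARY_PATH=…` value above repeats two of its entries (`…/openh264/extra` and `…/GL/default/lib`). A small Python sketch, using the path string copied from the manifest, that splits the value on `:` and reports the duplicates; the dedup loop preserves order, which matters for library search paths:

```python
# LD_LIBRARY_PATH value copied from the Flatpak manifest above.
ld_library_path = (
    "/app/lib:/usr/lib/x86_64-linux-gnu/GL/default/lib:"
    "/usr/lib/x86_64-linux-gnu/openh264/extra:"
    "/usr/lib/x86_64-linux-gnu/openh264/extra:"
    "/usr/lib/sdk/llvm15/lib:"
    "/usr/lib/x86_64-linux-gnu/GL/default/lib:"
    "/usr/lib/ollama:/app/plugins/AMD/lib/ollama"
)

entries = ld_library_path.split(":")

# Deduplicate while preserving search order (first occurrence wins).
seen, deduped = set(), []
for entry in entries:
    if entry not in seen:
        seen.add(entry)
        deduped.append(entry)

duplicates = sorted(e for e in set(entries) if entries.count(e) > 1)
print(duplicates)
```

Dropping the repeats would shorten the variable without changing lookup behavior, since the dynamic loader searches left to right.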
@@ -111,45 +110,6 @@
             }
         ]
     },
-    {
-        "name": "python3-youtube-transcript-api",
-        "buildsystem": "simple",
-        "build-commands": [
-            "pip3 install --verbose --exists-action=i --no-index --find-links=\"file://${PWD}\" --prefix=${FLATPAK_DEST} \"youtube-transcript-api\" --no-build-isolation"
-        ],
-        "sources": [
-            {
-                "type": "file",
-                "url": "https://files.pythonhosted.org/packages/12/90/3c9ff0512038035f59d279fddeb79f5f1eccd8859f06d6163c58798b9487/certifi-2024.8.30-py3-none-any.whl",
-                "sha256": "922820b53db7a7257ffbda3f597266d435245903d80737e34f8a45ff3e3230d8"
-            },
-            {
-                "type": "file",
-                "url": "https://files.pythonhosted.org/packages/f2/4f/e1808dc01273379acc506d18f1504eb2d299bd4131743b9fc54d7be4df1e/charset_normalizer-3.4.0.tar.gz",
-                "sha256": "223217c3d4f82c3ac5e29032b3f1c2eb0fb591b72161f86d93f5719079dae93e"
-            },
-            {
-                "type": "file",
-                "url": "https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl",
-                "sha256": "946d195a0d259cbba61165e88e65941f16e9b36ea6ddb97f00452bae8b1287d3"
-            },
-            {
-                "type": "file",
-                "url": "https://files.pythonhosted.org/packages/f9/9b/335f9764261e915ed497fcdeb11df5dfd6f7bf257d4a6a2a686d80da4d54/requests-2.32.3-py3-none-any.whl",
-                "sha256": "70761cfe03c773ceb22aa2f671b4757976145175cdfca038c02654d061d6dcc6"
-            },
-            {
-                "type": "file",
-                "url": "https://files.pythonhosted.org/packages/ce/d9/5f4c13cecde62396b0d3fe530a50ccea91e7dfc1ccf0e09c228841bb5ba8/urllib3-2.2.3-py3-none-any.whl",
-                "sha256": "ca899ca043dcb1bafa3e262d73aa25c465bfb49e0bd9dd5d59f1d0acba2f8fac"
-            },
-            {
-                "type": "file",
-                "url": "https://files.pythonhosted.org/packages/52/42/5f57d37d56bdb09722f226ed81cc1bec63942da745aa27266b16b0e16a5d/youtube_transcript_api-0.6.2-py3-none-any.whl",
-                "sha256": "019dbf265c6a68a0591c513fff25ed5a116ce6525832aefdfb34d4df5567121c"
-            }
-        ]
-    },
     {
         "name": "python3-html2text",
         "buildsystem": "simple",
@@ -174,16 +134,16 @@
         "sources": [
             {
                 "type": "archive",
-                "url": "https://github.com/ollama/ollama/releases/download/v0.3.12/ollama-linux-amd64.tgz",
-                "sha256": "f0efa42f7ad77cd156bd48c40cd22109473801e5113173b0ad04f094a4ef522b",
+                "url": "https://github.com/ollama/ollama/releases/download/v0.3.9/ollama-linux-amd64.tgz",
+                "sha256": "b0062fbccd46134818d9d59cfa3867ad6849163653cb1171bc852c5f379b0851",
                 "only-arches": [
                     "x86_64"
                 ]
             },
             {
                 "type": "archive",
-                "url": "https://github.com/ollama/ollama/releases/download/v0.3.12/ollama-linux-arm64.tgz",
-                "sha256": "da631cbe4dd2c168dae58d6868b1ff60e881e050f2d07578f2f736e689fec04c",
+                "url": "https://github.com/ollama/ollama/releases/download/v0.3.9/ollama-linux-arm64.tgz",
+                "sha256": "8979484bcb1448ab9b45107fbcb3b9f43c2af46f961487449b9ebf3518cd70eb",
                 "only-arches": [
                     "aarch64"
                 ]
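Both sides of this hunk pin the Ollama tarball by its SHA-256 digest; the build tool downloads each `archive` source and rejects it if the hash does not match. A hedged sketch of that check: stream a file through SHA-256 and compare against a pinned digest (the file and digest below are synthetic stand-ins, not the real tarball):

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream a file through SHA-256 rather than loading it into memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Synthetic stand-in for a downloaded release archive.
demo_path = "ollama-demo.tgz"
with open(demo_path, "wb") as fh:
    fh.write(b"stand-in archive contents")

# In a real manifest this value comes from the "sha256" field.
pinned_sha256 = hashlib.sha256(b"stand-in archive contents").hexdigest()

ok = sha256_of(demo_path) == pinned_sha256
print("checksum ok" if ok else "checksum MISMATCH")
```

Pinning both the URL and the digest is what makes bumping Ollama a two-line change per architecture in the manifest.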
@@ -206,18 +166,6 @@
         }
         ]
     },
-    {
-        "name": "vte",
-        "buildsystem": "meson",
-        "config-opts": ["-Dvapi=false"],
-        "sources": [
-            {
-                "type": "archive",
-                "url": "https://gitlab.gnome.org/GNOME/vte/-/archive/0.78.0/vte-0.78.0.tar.gz",
-                "sha256": "82e19d11780fed4b66400f000829ce5ca113efbbfb7975815f26ed93e4c05f2d"
-            }
-        ]
-    },
     {
         "name" : "alpaca",
         "builddir" : true,
@@ -63,14 +63,10 @@
     </screenshot>
     <screenshot>
      <image>https://jeffser.com/images/alpaca/screenie4.png</image>
-      <caption>A Python script running inside integrated terminal</caption>
-    </screenshot>
-    <screenshot>
-      <image>https://jeffser.com/images/alpaca/screenie5.png</image>
       <caption>A conversation involving a YouTube video transcript</caption>
     </screenshot>
     <screenshot>
-      <image>https://jeffser.com/images/alpaca/screenie6.png</image>
+      <image>https://jeffser.com/images/alpaca/screenie5.png</image>
       <caption>Multiple models being downloaded</caption>
     </screenshot>
   </screenshots>
@@ -82,133 +78,6 @@
   <url type="contribute">https://github.com/Jeffser/Alpaca/discussions/154</url>
   <url type="vcs-browser">https://github.com/Jeffser/Alpaca</url>
   <releases>
-    <release version="2.7.0" date="2024-10-15">
-      <url type="details">https://github.com/Jeffser/Alpaca/releases/tag/2.7.0</url>
-      <description>
-        <p>New</p>
-        <ul>
-          <li>User messages are now compacted into bubbles</li>
-        </ul>
-        <p>Fixes</p>
-        <ul>
-          <li>Fixed re connection dialog not working when 'use local instance' is selected</li>
-          <li>Fixed model manager not adapting to large system fonts</li>
-        </ul>
-      </description>
-    </release>
-    <release version="2.6.5" date="2024-10-13">
-      <url type="details">https://github.com/Jeffser/Alpaca/releases/tag/2.6.5</url>
-      <description>
-        <p>New</p>
-        <ul>
-          <li>Details page for models</li>
-          <li>Model selector gets replaced with 'manage models' button when there are no models downloaded</li>
-          <li>Added warning when model is too big for the device</li>
-          <li>Added AMD GPU indicator in preferences</li>
-        </ul>
-      </description>
-    </release>
-    <release version="2.6.0" date="2024-10-11">
-      <url type="details">https://github.com/Jeffser/Alpaca/releases/tag/2.6.0</url>
-      <description>
-        <p>New</p>
-        <ul>
-          <li>Better system for handling dialogs</li>
-          <li>Better system for handling instance switching</li>
-          <li>Remote connection dialog</li>
-        </ul>
-        <p>Fixes</p>
-        <ul>
-          <li>Fixed: Models get duplicated when switching remote and local instance</li>
-          <li>Better internal instance manager</li>
-        </ul>
-      </description>
-    </release>
-    <release version="2.5.1" date="2024-10-09">
-      <url type="details">https://github.com/Jeffser/Alpaca/releases/tag/2.5.1</url>
-      <description>
-        <p>New</p>
-        <ul>
-          <li>Added 'Cancel' and 'Save' buttons when editing a message</li>
-        </ul>
-        <p>Fixes</p>
-        <ul>
-          <li>Better handling of image recognition</li>
-          <li>Remove unused files when canceling a model download</li>
-          <li>Better message blocks rendering</li>
-        </ul>
-      </description>
-    </release>
-    <release version="2.5.0" date="2024-10-06">
-      <url type="details">https://github.com/Jeffser/Alpaca/releases/tag/2.5.0</url>
-      <description>
-        <p>New</p>
-        <ul>
-          <li>Run bash and python scripts straight from chat</li>
-          <li>Updated Ollama to 0.3.12</li>
-          <li>New models!</li>
-        </ul>
-        <p>Fixes</p>
-        <ul>
-          <li>Fixed and made faster the launch sequence</li>
-          <li>Better detection of code blocks in messages</li>
-          <li>Fixed app not loading in certain setups with Nvidia GPUs</li>
-        </ul>
-      </description>
-    </release>
-    <release version="2.0.6" date="2024-09-29">
-      <url type="details">https://github.com/Jeffser/Alpaca/releases/tag/2.0.6</url>
-      <description>
-        <p>Fixes</p>
-        <ul>
-          <li>Fixed message notification sometimes crashing text rendering because of them running on different threads</li>
-        </ul>
-      </description>
-    </release>
-    <release version="2.0.5" date="2024-09-25">
-      <url type="details">https://github.com/Jeffser/Alpaca/releases/tag/2.0.5</url>
-      <description>
-        <p>Fixes</p>
-        <ul>
-          <li>Fixed message generation sometimes failing</li>
-        </ul>
-      </description>
-    </release>
-    <release version="2.0.4" date="2024-09-22">
-      <url type="details">https://github.com/Jeffser/Alpaca/releases/tag/2.0.4</url>
-      <description>
-        <p>New</p>
-        <ul>
-          <li>Sidebar resizes with the window</li>
-          <li>New welcome dialog</li>
-          <li>Message search</li>
-          <li>Updated Ollama to v0.3.11</li>
-          <li>A lot of new models provided by Ollama repository</li>
-        </ul>
-        <p>Fixes</p>
-        <ul>
-          <li>Fixed text inside model manager when the accessibility option 'large text' is on</li>
-          <li>Fixed image recognition on unsupported models</li>
-        </ul>
-      </description>
-    </release>
-    <release version="2.0.3" date="2024-09-18">
-      <url type="details">https://github.com/Jeffser/Alpaca/releases/tag/2.0.3</url>
-      <description>
-        <p>Fixes</p>
-        <ul>
-          <li>Fixed spinner not hiding if the back end fails</li>
-          <li>Fixed image recognition with local images</li>
-          <li>Changed appearance of delete / stop model buttons</li>
-          <li>Fixed stop button crashing the app</li>
-        </ul>
-        <p>New</p>
-        <ul>
-          <li>Made sidebar resize a little when the window is smaller</li>
-          <li>Instant launch</li>
-        </ul>
-      </description>
-    </release>
     <release version="2.0.2" date="2024-09-11">
       <url type="details">https://github.com/Jeffser/Alpaca/releases/tag/2.0.2</url>
       <description>
@@ -1,5 +1,5 @@
 project('Alpaca', 'c',
-  version: '2.7.0',
+  version: '2.0.2',
   meson_version: '>= 0.62.0',
   default_options: [ 'warning_level=2', 'werror=false', ],
 )
@@ -9,5 +9,3 @@ hi
 tr
 uk
 de
-he
-te
@@ -5,11 +5,9 @@ src/main.py
 src/window.py
 src/available_models_descriptions.py
 src/connection_handler.py
+src/dialogs.py
 src/window.ui
-src/generic_actions.py
 src/custom_widgets/chat_widget.py
 src/custom_widgets/message_widget.py
 src/custom_widgets/model_widget.py
 src/custom_widgets/table_widget.py
-src/custom_widgets/dialog_widget.py
-src/custom_widgets/terminal_widget.py
po/alpaca.pot (2139 changes): file diff suppressed because it is too large
po/nb_NO.po (2180 changes): file diff suppressed because it is too large
po/pt_BR.po (2680 changes): file diff suppressed because it is too large
po/zh_Hans.po (2259 changes): file diff suppressed because it is too large
@@ -1,93 +0,0 @@
-name: jeffser-alpaca
-base: core24
-adopt-info: alpaca
-
-platforms:
-  amd64:
-  arm64:
-
-confinement: strict
-grade: stable
-compression: lzo
-
-slots:
-  dbus-alpaca:
-    interface: dbus
-    bus: session
-    name: com.jeffser.Alpaca
-
-apps:
-  alpaca:
-    command: usr/bin/alpaca
-    common-id: com.jeffser.Alpaca
-    extensions:
-    - gnome
-    plugs:
-    - network
-    - network-bind
-    - home
-    - removable-media
-
-  ollama:
-    command: bin/ollama
-    plugs:
-    - home
-    - removable-media
-    - network
-    - network-bind
-
-  ollama-daemon:
-    command: bin/ollama serve
-    daemon: simple
-    install-mode: enable
-    restart-condition: on-failure
-    plugs:
-    - home
-    - removable-media
-    - network
-    - network-bind
-
-parts:
-  # Python dependencies
-  python-deps:
-    plugin: python
-    source: .
-    python-packages:
-    - requests==2.31.0
-    - pillow==10.3.0
-    - pypdf==4.2.0
-    - pytube==15.0.0
-    - html2text==2024.2.26
-
-  # Ollama plugin
-  ollama:
-    plugin: dump
-    source:
-    - on amd64: https://github.com/ollama/ollama/releases/download/v0.3.12/ollama-linux-amd64.tgz
-    - on arm64: https://github.com/ollama/ollama/releases/download/v0.3.12/ollama-linux-arm64.tgz
-
-  # Alpaca app
-  alpaca:
-    plugin: meson
-    source-type: git
-    source: https://github.com/Jeffser/Alpaca.git
-    source-tag: 2.6.5
-    source-depth: 1
-    meson-parameters:
-    - --prefix=/snap/alpaca/current/usr
-    override-build: |
-      craftctl default
-      sed -i '1c#!/usr/bin/env python3' $CRAFT_PART_INSTALL/snap/alpaca/current/usr/bin/alpaca
-    parse-info:
-    - usr/share/metainfo/com.jeffser.Alpaca.metainfo.xml
-    organize:
-      snap/alpaca/current: .
-    after: [python-deps]
-
-  deps:
-    plugin: nil
-    after: [alpaca]
-    stage-packages:
-    - libnuma1
-    prime:
-    - usr/lib/*/libnuma.so.1*
@@ -31,9 +31,6 @@
     <file alias="icons/scalable/status/update-symbolic.svg">icons/update-symbolic.svg</file>
     <file alias="icons/scalable/status/down-symbolic.svg">icons/down-symbolic.svg</file>
     <file alias="icons/scalable/status/chat-bubble-text-symbolic.svg">icons/chat-bubble-text-symbolic.svg</file>
-    <file alias="icons/scalable/status/execute-from-symbolic.svg">icons/execute-from-symbolic.svg</file>
-    <file alias="icons/scalable/status/cross-large-symbolic.svg">icons/cross-large-symbolic.svg</file>
-    <file alias="icons/scalable/status/info-outline-symbolic.svg">icons/info-outline-symbolic.svg</file>
     <file preprocess="xml-stripblanks">window.ui</file>
     <file preprocess="xml-stripblanks">gtk/help-overlay.ui</file>
 </gresource>
File diff suppressed because it is too large
@@ -1,13 +1,11 @@
 descriptions = {
-    'llama3.2': _("Meta's Llama 3.2 goes small with 1B and 3B models."),
     'llama3.1': _("Llama 3.1 is a new state-of-the-art model from Meta available in 8B, 70B and 405B parameter sizes."),
-    'gemma2': _("Google Gemma 2 is a high-performing and efficient model available in three sizes: 2B, 9B, and 27B."),
-    'qwen2.5': _("Qwen2.5 models are pretrained on Alibaba's latest large-scale dataset, encompassing up to 18 trillion tokens. The model supports up to 128K tokens and has multilingual support."),
-    'phi3.5': _("A lightweight AI model with 3.8 billion parameters with performance overtaking similarly and larger sized models."),
-    'nemotron-mini': _("A commercial-friendly small language model by NVIDIA optimized for roleplay, RAG QA, and function calling."),
-    'mistral-small': _("Mistral Small is a lightweight model designed for cost-effective use in tasks like translation and summarization."),
+    'gemma2': _("Google Gemma 2 is a high-performing and efficient model by now available in three sizes: 2B, 9B, and 27B."),
     'mistral-nemo': _("A state-of-the-art 12B model with 128k context length, built by Mistral AI in collaboration with NVIDIA."),
+    'mistral-large': _("Mistral Large 2 is Mistral's new flagship model that is significantly more capable in code generation, mathematics, and reasoning with 128k context window and support for dozens of languages."),
+    'qwen2': _("Qwen2 is a new series of large language models from Alibaba group"),
     'deepseek-coder-v2': _("An open-source Mixture-of-Experts code language model that achieves performance comparable to GPT4-Turbo in code-specific tasks."),
+    'phi3': _("Phi-3 is a family of lightweight 3B (Mini) and 14B (Medium) state-of-the-art open models by Microsoft."),
     'mistral': _("The 7B model released by Mistral AI, updated to version 0.3."),
     'mixtral': _("A set of Mixture of Experts (MoE) model with open weights by Mistral AI in 8x7b and 8x22b parameter sizes."),
     'codegemma': _("CodeGemma is a collection of powerful, lightweight models that can perform a variety of coding tasks like fill-in-the-middle code completion, code generation, natural language understanding, mathematical reasoning, and instruction following."),
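Every value in the `descriptions` dict above is wrapped in `_()`, the conventional gettext alias, so the model blurbs are translatable through the po files listed earlier. A minimal sketch of how such a table behaves with a no-op `_` installed (entries abridged to two copied from the diff; the real file defines many more, and in the app `_` would come from the configured gettext domain):

```python
# No-op stand-in for gettext.gettext; returns the untranslated string.
def _(message: str) -> str:
    return message

descriptions = {
    'llama3.1': _("Llama 3.1 is a new state-of-the-art model from Meta available in 8B, 70B and 405B parameter sizes."),
    'mistral': _("The 7B model released by Mistral AI, updated to version 0.3."),
}

def describe(model: str) -> str:
    # Fall back to a generic, also-translatable blurb for unknown models.
    return descriptions.get(model, _("No description available."))

print(describe('mistral'))
print(describe('some-unknown-model'))
```

Because `_()` runs at dict construction time, the table holds already-translated strings for whatever locale is active when the module is imported.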
@@ -17,108 +15,98 @@ descriptions = {
 
     'llama3': _("Meta Llama 3: The most capable openly available LLM to date"),
     'gemma': _("Gemma is a family of lightweight, state-of-the-art open models built by Google DeepMind. Updated to version 1.1"),
     'qwen': _("Qwen 1.5 is a series of large language models by Alibaba Cloud spanning from 0.5B to 110B parameters"),
-    'qwen2': _("Qwen2 is a new series of large language models from Alibaba group"),
-    'phi3': _("Phi-3 is a family of lightweight 3B (Mini) and 14B (Medium) state-of-the-art open models by Microsoft."),
     'llama2': _("Llama 2 is a collection of foundation language models ranging from 7B to 70B parameters."),
     'codellama': _("A large language model that can use text prompts to generate and discuss code."),
     'nomic-embed-text': _("A high-performing open embedding model with a large token context window."),
-    'mxbai-embed-large': _("State-of-the-art large embedding model from mixedbread.ai"),
     'dolphin-mixtral': _("Uncensored, 8x7b and 8x22b fine-tuned models based on the Mixtral mixture of experts models that excels at coding tasks. Created by Eric Hartford."),
     'phi': _("Phi-2: a 2.7B language model by Microsoft Research that demonstrates outstanding reasoning and language understanding capabilities."),
-    'deepseek-coder': _("DeepSeek Coder is a capable coding model trained on two trillion code and natural language tokens."),
-    'starcoder2': _("StarCoder2 is the next generation of transparently trained open code LLMs that comes in three sizes: 3B, 7B and 15B parameters."),
     'llama2-uncensored': _("Uncensored Llama 2 model by George Sung and Jarrad Hope."),
-    'dolphin-mistral': _("The uncensored Dolphin model based on Mistral that excels at coding tasks. Updated to version 2.8."),
+    'deepseek-coder': _("DeepSeek Coder is a capable coding model trained on two trillion code and natural language tokens."),
+    'mxbai-embed-large': _("State-of-the-art large embedding model from mixedbread.ai"),
     'zephyr': _("Zephyr is a series of fine-tuned versions of the Mistral and Mixtral models that are trained to act as helpful assistants."),
-    'yi': _("Yi 1.5 is a high-performing, bilingual language model."),
-    'dolphin-llama3': _("Dolphin 2.9 is a new model with 8B and 70B sizes by Eric Hartford based on Llama 3 that has a variety of instruction, conversational, and coding skills."),
+    'dolphin-mistral': _("The uncensored Dolphin model based on Mistral that excels at coding tasks. Updated to version 2.8."),
+    'starcoder2': _("StarCoder2 is the next generation of transparently trained open code LLMs that comes in three sizes: 3B, 7B and 15B parameters."),
     'orca-mini': _("A general-purpose model ranging from 3 billion parameters to 70 billion, suitable for entry-level hardware."),
-    'llava-llama3': _("A LLaVA model fine-tuned from Llama 3 Instruct with better scores in several benchmarks."),
-    'qwen2.5-coder': _("The latest series of Code-Specific Qwen models, with significant improvements in code generation, code reasoning, and code fixing."),
+    'dolphin-llama3': _("Dolphin 2.9 is a new model with 8B and 70B sizes by Eric Hartford based on Llama 3 that has a variety of instruction, conversational, and coding skills."),
+    'yi': _("Yi 1.5 is a high-performing, bilingual language model."),
     'mistral-openorca': _("Mistral OpenOrca is a 7 billion parameter model, fine-tuned on top of the Mistral 7B model using the OpenOrca dataset."),
+    'llava-llama3': _("A LLaVA model fine-tuned from Llama 3 Instruct with better scores in several benchmarks."),
     'starcoder': _("StarCoder is a code generation model trained on 80+ programming languages."),
+    'llama2-chinese': _("Llama 2 based model fine tuned to improve Chinese dialogue ability."),
+    'vicuna': _("General use chat model based on Llama and Llama 2 with 2K to 16K context sizes."),
     'tinyllama': _("The TinyLlama project is an open endeavor to train a compact 1.1B Llama model on 3 trillion tokens."),
     'codestral': _("Codestral is Mistral AI’s first-ever code model designed for code generation tasks."),
-    'vicuna': _("General use chat model based on Llama and Llama 2 with 2K to 16K context sizes."),
-    'llama2-chinese': _("Llama 2 based model fine tuned to improve Chinese dialogue ability."),
-    'snowflake-arctic-embed': _("A suite of text embedding models by Snowflake, optimized for performance."),
     'wizard-vicuna-uncensored': _("Wizard Vicuna Uncensored is a 7B, 13B, and 30B parameter model based on Llama 2 uncensored by Eric Hartford."),
-    'granite-code': _("A family of open foundation models by IBM for Code Intelligence"),
-    'codegeex4': _("A versatile model for AI software development scenarios, including code completion."),
     'nous-hermes2': _("The powerful family of models by Nous Research that excels at scientific discussion and coding tasks."),
|
||||||
'all-minilm': _("Embedding models on very large sentence level datasets."),
|
|
||||||
'openchat': _("A family of open-source models trained on a wide variety of data, surpassing ChatGPT on various benchmarks. Updated to version 3.5-0106."),
|
'openchat': _("A family of open-source models trained on a wide variety of data, surpassing ChatGPT on various benchmarks. Updated to version 3.5-0106."),
|
||||||
'aya': _("Aya 23, released by Cohere, is a new family of state-of-the-art, multilingual models that support 23 languages."),
|
'aya': _("Aya 23, released by Cohere, is a new family of state-of-the-art, multilingual models that support 23 languages."),
|
||||||
'codeqwen': _("CodeQwen1.5 is a large language model pretrained on a large amount of code data."),
|
|
||||||
'wizardlm2': _("State of the art large language model from Microsoft AI with improved performance on complex chat, multilingual, reasoning and agent use cases."),
|
'wizardlm2': _("State of the art large language model from Microsoft AI with improved performance on complex chat, multilingual, reasoning and agent use cases."),
|
||||||
'tinydolphin': _("An experimental 1.1B parameter model trained on the new Dolphin 2.8 dataset by Eric Hartford and based on TinyLlama."),
|
'tinydolphin': _("An experimental 1.1B parameter model trained on the new Dolphin 2.8 dataset by Eric Hartford and based on TinyLlama."),
|
||||||
|
'granite-code': _("A family of open foundation models by IBM for Code Intelligence"),
|
||||||
'wizardcoder': _("State-of-the-art code generation model"),
|
'wizardcoder': _("State-of-the-art code generation model"),
|
||||||
'stable-code': _("Stable Code 3B is a coding model with instruct and code completion variants on par with models such as Code Llama 7B that are 2.5x larger."),
|
'stable-code': _("Stable Code 3B is a coding model with instruct and code completion variants on par with models such as Code Llama 7B that are 2.5x larger."),
|
||||||
'openhermes': _("OpenHermes 2.5 is a 7B model fine-tuned by Teknium on Mistral with fully open datasets."),
|
'openhermes': _("OpenHermes 2.5 is a 7B model fine-tuned by Teknium on Mistral with fully open datasets."),
|
||||||
'qwen2-math': _("Qwen2 Math is a series of specialized math language models built upon the Qwen2 LLMs, which significantly outperforms the mathematical capabilities of open-source models and even closed-source models (e.g., GPT4o)."),
|
'all-minilm': _("Embedding models on very large sentence level datasets."),
|
||||||
'bakllava': _("BakLLaVA is a multimodal model consisting of the Mistral 7B base model augmented with the LLaVA architecture."),
|
'codeqwen': _("CodeQwen1.5 is a large language model pretrained on a large amount of code data."),
|
||||||
'stablelm2': _("Stable LM 2 is a state-of-the-art 1.6B and 12B parameter language model trained on multilingual data in English, Spanish, German, Italian, French, Portuguese, and Dutch."),
|
'stablelm2': _("Stable LM 2 is a state-of-the-art 1.6B and 12B parameter language model trained on multilingual data in English, Spanish, German, Italian, French, Portuguese, and Dutch."),
|
||||||
'llama3-gradient': _("This model extends LLama-3 8B's context length from 8k to over 1m tokens."),
|
|
||||||
'deepseek-llm': _("An advanced language model crafted with 2 trillion bilingual tokens."),
|
|
||||||
'wizard-math': _("Model focused on math and logic problems"),
|
'wizard-math': _("Model focused on math and logic problems"),
|
||||||
'glm4': _("A strong multi-lingual general language model with competitive performance to Llama 3."),
|
|
||||||
'neural-chat': _("A fine-tuned model based on Mistral with good coverage of domain and language."),
|
'neural-chat': _("A fine-tuned model based on Mistral with good coverage of domain and language."),
|
||||||
'reflection': _("A high-performing model trained with a new technique called Reflection-tuning that teaches a LLM to detect mistakes in its reasoning and correct course."),
|
'llama3-gradient': _("This model extends LLama-3 8B's context length from 8k to over 1m tokens."),
|
||||||
'llama3-chatqa': _("A model from NVIDIA based on Llama 3 that excels at conversational question answering (QA) and retrieval-augmented generation (RAG)."),
|
|
||||||
'mistral-large': _("Mistral Large 2 is Mistral's new flagship model that is significantly more capable in code generation, mathematics, and reasoning with 128k context window and support for dozens of languages."),
|
|
||||||
'moondream': _("moondream2 is a small vision language model designed to run efficiently on edge devices."),
|
|
||||||
'xwinlm': _("Conversational model based on Llama 2 that performs competitively on various benchmarks."),
|
|
||||||
'phind-codellama': _("Code generation model based on Code Llama."),
|
'phind-codellama': _("Code generation model based on Code Llama."),
|
||||||
'nous-hermes': _("General use models based on Llama and Llama 2 from Nous Research."),
|
'nous-hermes': _("General use models based on Llama and Llama 2 from Nous Research."),
|
||||||
'sqlcoder': _("SQLCoder is a code completion model fined-tuned on StarCoder for SQL generation tasks"),
|
|
||||||
'dolphincoder': _("A 7B and 15B uncensored variant of the Dolphin model family that excels at coding, based on StarCoder2."),
|
'dolphincoder': _("A 7B and 15B uncensored variant of the Dolphin model family that excels at coding, based on StarCoder2."),
|
||||||
|
'sqlcoder': _("SQLCoder is a code completion model fined-tuned on StarCoder for SQL generation tasks"),
|
||||||
|
'xwinlm': _("Conversational model based on Llama 2 that performs competitively on various benchmarks."),
|
||||||
|
'deepseek-llm': _("An advanced language model crafted with 2 trillion bilingual tokens."),
|
||||||
'yarn-llama2': _("An extension of Llama 2 that supports a context of up to 128k tokens."),
|
'yarn-llama2': _("An extension of Llama 2 that supports a context of up to 128k tokens."),
|
||||||
'smollm': _("🪐 A family of small models with 135M, 360M, and 1.7B parameters, trained on a new high-quality dataset."),
|
'llama3-chatqa': _("A model from NVIDIA based on Llama 3 that excels at conversational question answering (QA) and retrieval-augmented generation (RAG)."),
|
||||||
'wizardlm': _("General use model based on Llama 2."),
|
'wizardlm': _("General use model based on Llama 2."),
|
||||||
'deepseek-v2': _("A strong, economical, and efficient Mixture-of-Experts language model."),
|
|
||||||
'starling-lm': _("Starling is a large language model trained by reinforcement learning from AI feedback focused on improving chatbot helpfulness."),
|
'starling-lm': _("Starling is a large language model trained by reinforcement learning from AI feedback focused on improving chatbot helpfulness."),
|
||||||
'samantha-mistral': _("A companion assistant trained in philosophy, psychology, and personal relationships. Based on Mistral."),
|
'codegeex4': _("A versatile model for AI software development scenarios, including code completion."),
|
||||||
'solar': _("A compact, yet powerful 10.7B large language model designed for single-turn conversation."),
|
'snowflake-arctic-embed': _("A suite of text embedding models by Snowflake, optimized for performance."),
|
||||||
'orca2': _("Orca 2 is built by Microsoft research, and are a fine-tuned version of Meta's Llama 2 models. The model is designed to excel particularly in reasoning."),
|
'orca2': _("Orca 2 is built by Microsoft research, and are a fine-tuned version of Meta's Llama 2 models. The model is designed to excel particularly in reasoning."),
|
||||||
'stable-beluga': _("Llama 2 based model fine tuned on an Orca-style dataset. Originally called Free Willy."),
|
'solar': _("A compact, yet powerful 10.7B large language model designed for single-turn conversation."),
|
||||||
|
'samantha-mistral': _("A companion assistant trained in philosophy, psychology, and personal relationships. Based on Mistral."),
|
||||||
|
'moondream': _("moondream2 is a small vision language model designed to run efficiently on edge devices."),
|
||||||
|
'smollm': _("🪐 A family of small models with 135M, 360M, and 1.7B parameters, trained on a new high-quality dataset."),
|
||||||
|
'stable-beluga': _("🪐 A family of small models with 135M, 360M, and 1.7B parameters, trained on a new high-quality dataset."),
|
||||||
|
'qwen2-math': _("Qwen2 Math is a series of specialized math language models built upon the Qwen2 LLMs, which significantly outperforms the mathematical capabilities of open-source models and even closed-source models (e.g., GPT4o)."),
|
||||||
'dolphin-phi': _("2.7B uncensored Dolphin model by Eric Hartford, based on the Phi language model by Microsoft Research."),
|
'dolphin-phi': _("2.7B uncensored Dolphin model by Eric Hartford, based on the Phi language model by Microsoft Research."),
|
||||||
|
'deepseek-v2': _("A strong, economical, and efficient Mixture-of-Experts language model."),
|
||||||
|
'bakllava': _("BakLLaVA is a multimodal model consisting of the Mistral 7B base model augmented with the LLaVA architecture."),
|
||||||
|
'glm4': _("A strong multi-lingual general language model with competitive performance to Llama 3."),
|
||||||
'wizardlm-uncensored': _("Uncensored version of Wizard LM model"),
|
'wizardlm-uncensored': _("Uncensored version of Wizard LM model"),
|
||||||
'hermes3': _("Hermes 3 is the latest version of the flagship Hermes series of LLMs by Nous Research"),
|
|
||||||
'yi-coder': _("Yi-Coder is a series of open-source code language models that delivers state-of-the-art coding performance with fewer than 10 billion parameters."),
|
|
||||||
'llava-phi3': _("A new small LLaVA model fine-tuned from Phi 3 Mini."),
|
|
||||||
'internlm2': _("InternLM2.5 is a 7B parameter model tailored for practical scenarios with outstanding reasoning capability."),
|
|
||||||
'yarn-mistral': _("An extension of Mistral to support context windows of 64K or 128K."),
|
'yarn-mistral': _("An extension of Mistral to support context windows of 64K or 128K."),
|
||||||
'llama-pro': _("An expansion of Llama 2 that specializes in integrating both general language understanding and domain-specific knowledge, particularly in programming and mathematics."),
|
'phi3.5': _("A lightweight AI model with 3.8 billion parameters with performance overtaking similarly and larger sized models."),
|
||||||
'medllama2': _("Fine-tuned Llama 2 model to answer medical questions based on an open source medical dataset."),
|
'medllama2': _("Fine-tuned Llama 2 model to answer medical questions based on an open source medical dataset."),
|
||||||
|
'llama-pro': _("An expansion of Llama 2 that specializes in integrating both general language understanding and domain-specific knowledge, particularly in programming and mathematics."),
|
||||||
|
'llava-phi3': _("A new small LLaVA model fine-tuned from Phi 3 Mini."),
|
||||||
'meditron': _("Open-source medical large language model adapted from Llama 2 to the medical domain."),
|
'meditron': _("Open-source medical large language model adapted from Llama 2 to the medical domain."),
|
||||||
'nexusraven': _("Nexus Raven is a 13B instruction tuned model for function calling tasks."),
|
|
||||||
'nous-hermes2-mixtral': _("The Nous Hermes 2 model from Nous Research, now trained over Mixtral."),
|
'nous-hermes2-mixtral': _("The Nous Hermes 2 model from Nous Research, now trained over Mixtral."),
|
||||||
|
'nexusraven': _("Nexus Raven is a 13B instruction tuned model for function calling tasks."),
|
||||||
'codeup': _("Great code generation model based on Llama2."),
|
'codeup': _("Great code generation model based on Llama2."),
|
||||||
'llama3-groq-tool-use': _("A series of models from Groq that represent a significant advancement in open-source AI capabilities for tool use/function calling."),
|
|
||||||
'everythinglm': _("Uncensored Llama2 based model with support for a 16K context window."),
|
'everythinglm': _("Uncensored Llama2 based model with support for a 16K context window."),
|
||||||
|
'hermes3': _("Hermes 3 is the latest version of the flagship Hermes series of LLMs by Nous Research"),
|
||||||
|
'internlm2': _("InternLM2.5 is a 7B parameter model tailored for practical scenarios with outstanding reasoning capability."),
|
||||||
'magicoder': _("🎩 Magicoder is a family of 7B parameter models trained on 75K synthetic instruction data using OSS-Instruct, a novel approach to enlightening LLMs with open-source code snippets."),
|
'magicoder': _("🎩 Magicoder is a family of 7B parameter models trained on 75K synthetic instruction data using OSS-Instruct, a novel approach to enlightening LLMs with open-source code snippets."),
|
||||||
'stablelm-zephyr': _("A lightweight chat model allowing accurate, and responsive output without requiring high-end hardware."),
|
'stablelm-zephyr': _("A lightweight chat model allowing accurate, and responsive output without requiring high-end hardware."),
|
||||||
'codebooga': _("A high-performing code instruct model created by merging two existing code models."),
|
'codebooga': _("A high-performing code instruct model created by merging two existing code models."),
|
||||||
'wizard-vicuna': _("Wizard Vicuna is a 13B parameter model based on Llama 2 trained by MelodysDreamj."),
|
|
||||||
'mistrallite': _("MistralLite is a fine-tuned model based on Mistral with enhanced capabilities of processing long contexts."),
|
'mistrallite': _("MistralLite is a fine-tuned model based on Mistral with enhanced capabilities of processing long contexts."),
|
||||||
|
'llama3-groq-tool-use': _("A series of models from Groq that represent a significant advancement in open-source AI capabilities for tool use/function calling."),
|
||||||
'falcon2': _("Falcon2 is an 11B parameters causal decoder-only model built by TII and trained over 5T tokens."),
|
'falcon2': _("Falcon2 is an 11B parameters causal decoder-only model built by TII and trained over 5T tokens."),
|
||||||
|
'wizard-vicuna': _("Wizard Vicuna is a 13B parameter model based on Llama 2 trained by MelodysDreamj."),
|
||||||
'duckdb-nsql': _("7B parameter text-to-SQL model made by MotherDuck and Numbers Station."),
|
'duckdb-nsql': _("7B parameter text-to-SQL model made by MotherDuck and Numbers Station."),
|
||||||
'minicpm-v': _("A series of multimodal LLMs (MLLMs) designed for vision-language understanding."),
|
|
||||||
'megadolphin': _("MegaDolphin-2.2-120b is a transformation of Dolphin-2.2-70b created by interleaving the model with itself."),
|
'megadolphin': _("MegaDolphin-2.2-120b is a transformation of Dolphin-2.2-70b created by interleaving the model with itself."),
|
||||||
'notux': _("A top-performing mixture of experts model, fine-tuned with high-quality data."),
|
'notux': _("A top-performing mixture of experts model, fine-tuned with high-quality data."),
|
||||||
'goliath': _("A language model created by combining two fine-tuned Llama 2 70B models into one."),
|
'goliath': _("A language model created by combining two fine-tuned Llama 2 70B models into one."),
|
||||||
'open-orca-platypus2': _("Merge of the Open Orca OpenChat model and the Garage-bAInd Platypus 2 model. Designed for chat and code generation."),
|
'open-orca-platypus2': _("Merge of the Open Orca OpenChat model and the Garage-bAInd Platypus 2 model. Designed for chat and code generation."),
|
||||||
'notus': _("A 7B chat model fine-tuned with high-quality data and based on Zephyr."),
|
'notus': _("A 7B chat model fine-tuned with high-quality data and based on Zephyr."),
|
||||||
'bge-m3': _("BGE-M3 is a new model from BAAI distinguished for its versatility in Multi-Functionality, Multi-Linguality, and Multi-Granularity."),
|
|
||||||
'mathstral': _("MathΣtral: a 7B model designed for math reasoning and scientific discovery by Mistral AI."),
|
|
||||||
'dbrx': _("DBRX is an open, general-purpose LLM created by Databricks."),
|
'dbrx': _("DBRX is an open, general-purpose LLM created by Databricks."),
|
||||||
'solar-pro': _("Solar Pro Preview: an advanced large language model (LLM) with 22 billion parameters designed to fit into a single GPU"),
|
'mathstral': _("MathΣtral: a 7B model designed for math reasoning and scientific discovery by Mistral AI."),
|
||||||
'nuextract': _("A 3.8B model fine-tuned on a private high-quality synthetic dataset for information extraction, based on Phi-3."),
|
'bge-m3': _("BGE-M3 is a new model from BAAI distinguished for its versatility in Multi-Functionality, Multi-Linguality, and Multi-Granularity."),
|
||||||
'alfred': _("A robust conversational model designed to be used for both chat and instruct use cases."),
|
'alfred': _("A robust conversational model designed to be used for both chat and instruct use cases."),
|
||||||
'firefunction-v2': _("An open weights function calling model based on Llama 3, competitive with GPT-4o function calling capabilities."),
|
'firefunction-v2': _("An open weights function calling model based on Llama 3, competitive with GPT-4o function calling capabilities."),
|
||||||
'reader-lm': _("A series of models that convert HTML content to Markdown content, which is useful for content conversion tasks."),
|
'nuextract': _("A 3.8B model fine-tuned on a private high-quality synthetic dataset for information extraction, based on Phi-3."),
|
||||||
'bge-large': _("Embedding model from BAAI mapping texts to vectors."),
|
'bge-large': _("Embedding model from BAAI mapping texts to vectors."),
|
||||||
'deepseek-v2.5': _("An upgraded version of DeekSeek-V2 that integrates the general and coding abilities of both DeepSeek-V2-Chat and DeepSeek-Coder-V2-Instruct."),
|
|
||||||
'bespoke-minicheck': _("A state-of-the-art fact-checking model developed by Bespoke Labs."),
|
|
||||||
'paraphrase-multilingual': _("Sentence-transformers model that can be used for tasks like clustering or semantic search."),
|
'paraphrase-multilingual': _("Sentence-transformers model that can be used for tasks like clustering or semantic search."),
|
||||||
}
|
}
|
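The dictionary diffed above maps Ollama model names to gettext-wrapped description strings. A minimal sketch of how such a table is consumed, with `_` stubbed out as an identity function (in Alpaca the real translation function comes from the app's locale setup, and the `describe` helper is invented for this sketch):

```python
# Stand-in for gettext's translation function; an assumption of this sketch.
_ = lambda s: s

# A tiny excerpt of the table: model name -> translatable description.
available_models_descriptions = {
    'zephyr': _("Zephyr is a series of fine-tuned versions of the Mistral and Mixtral models that are trained to act as helpful assistants."),
    'tinyllama': _("The TinyLlama project is an open endeavor to train a compact 1.1B Llama model on 3 trillion tokens."),
}

def describe(model_name: str) -> str:
    # Fall back to an empty string for models without a curated description.
    return available_models_descriptions.get(model_name, '')
```

Wrapping each literal in `_()` at definition time means the table is translated once per locale, and callers only ever deal with plain strings.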
@@ -11,8 +11,6 @@ logger = getLogger(__name__)
 
 window = None
 
-AMD_support_label = "\n<a href='https://github.com/Jeffser/Alpaca/wiki/AMD-Support'>{}</a>".format(_('Alpaca Support'))
-
 def log_output(pipe):
     with open(os.path.join(data_dir, 'tmp.log'), 'a') as f:
         with pipe:
@@ -21,18 +19,7 @@ def log_output(pipe):
                 print(line, end='')
                 f.write(line)
                 f.flush()
-                if 'msg="model request too large for system"' in line:
-                    window.show_toast(_("Model request too large for system"), window.main_overlay)
-                elif 'msg="amdgpu detected, but no compatible rocm library found.' in line:
-                    if bool(os.getenv("FLATPAK_ID")):
-                        window.ollama_information_label.set_label(_("AMD GPU detected but the extension is missing, Ollama will use CPU.") + AMD_support_label)
-                    else:
-                        window.ollama_information_label.set_label(_("AMD GPU detected but ROCm is missing, Ollama will use CPU.") + AMD_support_label)
-                    window.ollama_information_label.set_css_classes(['dim-label', 'error'])
-                elif 'msg="amdgpu is supported"' in line:
-                    window.ollama_information_label.set_label(_("Using AMD GPU type '{}'").format(line.split('=')[-1]))
-                    window.ollama_information_label.set_css_classes(['dim-label', 'success'])
-        except Exception as e:
+        except:
             pass
 
 class instance():
@@ -105,7 +92,6 @@ class instance():
         self.idle_timer.start()
 
     def start(self):
-        self.stop()
         if shutil.which('ollama'):
             if not os.path.isdir(os.path.join(cache_dir, 'tmp/ollama')):
                 os.mkdir(os.path.join(cache_dir, 'tmp/ollama'))
@@ -129,10 +115,10 @@ class instance():
             self.instance = instance
             if not self.idle_timer:
                 self.start_timer()
-            window.ollama_information_label.set_label(_("Integrated Ollama instance is running"))
-            window.ollama_information_label.set_css_classes(['dim-label', 'success'])
         else:
             self.remote = True
+            if not self.remote_url:
+                window.remote_connection_entry.set_text('http://0.0.0.0:11434')
             window.remote_connection_switch.set_sensitive(True)
             window.remote_connection_switch.set_active(True)
 
@@ -145,8 +131,6 @@ class instance():
             self.instance.terminate()
             self.instance.wait()
             self.instance = None
-            window.ollama_information_label.set_label(_("Integrated Ollama instance is not running"))
-            window.ollama_information_label.set_css_classes(['dim-label'])
         logger.info("Stopped Alpaca's Ollama instance")
 
     def reset(self):
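Both versions of `log_output` stream the managed Ollama subprocess's output line by line into `tmp.log`; the `main` side additionally inspects each line for status messages. A self-contained sketch of that streaming pattern, with a generic `sink` standing in for the real log file (the `sink` parameter and the echo command are assumptions of this sketch, not Alpaca's API):

```python
import io
import subprocess
import sys

def log_output(pipe, sink):
    # Mirror each line of the child's stdout into a log sink, the same
    # line-by-line loop the diffed log_output runs over tmp.log.
    with pipe:
        for line in iter(pipe.readline, ''):
            sink.write(line)

# Usage: capture a short-lived process's output into an in-memory log.
proc = subprocess.Popen(
    [sys.executable, '-c', "print('ollama started')"],
    stdout=subprocess.PIPE, text=True)
log = io.StringIO()
log_output(proc.stdout, log)
proc.wait()
```

Reading with `iter(pipe.readline, '')` delivers lines as the child produces them, which is why the `main` branch could react to individual log messages (GPU detection, oversized requests) in real time.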
@@ -6,7 +6,7 @@ Handles the chat widget (testing)
 import gi
 gi.require_version('Gtk', '4.0')
 gi.require_version('GtkSource', '5')
-from gi.repository import Gtk, Gio, Adw, Gdk, GLib
+from gi.repository import Gtk, Gio, Adw, Gdk
 import logging, os, datetime, shutil, random, tempfile, tarfile, json
 from ..internal import data_dir
 from .message_widget import message
@@ -66,15 +66,13 @@ class chat(Gtk.ScrolledWindow):
             vexpand=True,
             hexpand=True,
             css_classes=["undershoot-bottom"],
-            name=name,
-            hscrollbar_policy=2
+            name=name
         )
         self.messages = {}
         self.welcome_screen = None
         self.regenerate_button = None
         self.busy = False
-        #self.get_vadjustment().connect('notify::page-size', lambda va, *_: va.set_value(va.get_upper() - va.get_page_size()) if va.get_value() == 0 else None)
-        ##TODO Figure out how to do this with the search thing
+        self.get_vadjustment().connect('notify::page-size', lambda va, *_: va.set_value(va.get_upper() - va.get_page_size()) if va.get_value() == 0 else None)
 
     def stop_message(self):
         self.busy = False
@@ -87,8 +85,6 @@ class chat(Gtk.ScrolledWindow):
         self.stop_message()
         for widget in list(self.container):
             self.container.remove(widget)
-        self.show_welcome_screen(len(window.model_manager.get_model_list()) > 0)
-        print('clear chat for some reason')
 
     def add_message(self, message_id:str, model:str=None):
         msg = message(message_id, model)
@@ -105,9 +101,7 @@ class chat(Gtk.ScrolledWindow):
         if self.welcome_screen:
             self.container.remove(self.welcome_screen)
             self.welcome_screen = None
-        if len(list(self.container)) > 0:
-            self.clear_chat()
-            return
+        self.clear_chat()
         button_container = Gtk.Box(
             orientation=1,
             spacing=10,
@@ -127,7 +121,7 @@ class chat(Gtk.ScrolledWindow):
             tooltip_text=_("Open Model Manager"),
             css_classes=["suggested-action", "pill"]
         )
-        button.set_action_name('app.manage_models')
+        button.connect('clicked', lambda *_ : window.manage_models_dialog.present(window))
         button_container.append(button)
 
         self.welcome_screen = Adw.StatusPage(
@@ -159,8 +153,8 @@ class chat(Gtk.ScrolledWindow):
                     for file_name, file_type in message_data['files'].items():
                         files[os.path.join(data_dir, "chats", self.get_name(), message_id, file_name)] = file_type
                     message_element.add_attachments(files)
-                GLib.idle_add(message_element.set_text, message_data['content'])
-                GLib.idle_add(message_element.add_footer, datetime.datetime.strptime(message_data['date'] + (":00" if message_data['date'].count(":") == 1 else ""), '%Y/%m/%d %H:%M:%S'))
+                message_element.set_text(message_data['content'])
+                message_element.add_footer(datetime.datetime.strptime(message_data['date'] + (":00" if message_data['date'].count(":") == 1 else ""), '%Y/%m/%d %H:%M:%S'))
             else:
                 self.show_welcome_screen(len(window.model_manager.get_model_list()) > 0)
 
@@ -338,8 +332,6 @@ class chat_list(Gtk.ListBox):
         window.save_history()
 
     def rename_chat(self, old_chat_name:str, new_chat_name:str):
-        if new_chat_name == old_chat_name:
-            return
         tab = self.get_tab_by_name(old_chat_name)
         if tab:
             new_chat_name = window.generate_numbered_name(new_chat_name, [tab.chat_window.get_name() for tab in self.tab_list])
@@ -444,10 +436,6 @@ class chat_list(Gtk.ListBox):
         if row:
             current_tab_i = next((i for i, t in enumerate(self.tab_list) if t.chat_window == window.chat_stack.get_visible_child()), -1)
             if self.tab_list.index(row) != current_tab_i:
-                if window.searchentry_messages.get_text() != '':
-                    window.searchentry_messages.set_text('')
-                    window.message_search_changed(window.searchentry_messages, window.chat_stack.get_visible_child())
-                window.message_searchbar.set_search_mode(False)
                 window.chat_stack.set_transition_type(4 if self.tab_list.index(row) > current_tab_i else 5)
                 window.chat_stack.set_visible_child(row.chat_window)
                 window.switch_send_stop_button(not row.chat_window.busy)
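The `main` side wraps `set_text` and `add_footer` in `GLib.idle_add` because GTK widgets may only be touched from the main loop; `idle_add` queues the call there instead of running it on whatever thread loads the chat history. A pure-Python sketch of that marshalling idea, without GTK (the `idle_add`/`drain` names here are illustrative stand-ins, not GLib's actual API):

```python
from queue import Queue

ui_queue = Queue()  # stands in for the GLib main loop's idle queue

def idle_add(fn, *args):
    # Instead of invoking the widget method immediately (possibly from a
    # worker thread), queue it so the "main loop" runs it later.
    ui_queue.put((fn, args))

def drain():
    # The main loop periodically executes queued callbacks in FIFO order.
    while not ui_queue.empty():
        fn, args = ui_queue.get()
        fn(*args)

labels = []
idle_add(labels.append, 'hello')  # nothing visible yet
drain()                           # the queued call runs on the "main loop"
```

Dropping the `GLib.idle_add` wrappers (the 2.0.2 side) is only safe if the calling code is guaranteed to already be on the main thread.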
@ -1,173 +0,0 @@
|
|||||||
#dialog_widget.py
|
|
||||||
"""
|
|
||||||
Handles all dialogs
|
|
||||||
"""
|
|
||||||
|
|
||||||
import gi
|
|
||||||
gi.require_version('Gtk', '4.0')
|
|
||||||
gi.require_version('GtkSource', '5')
|
|
||||||
from gi.repository import Gtk, Gio, Adw, Gdk, GLib
|
|
||||||
|
|
||||||
window=None
|
|
||||||
|
|
||||||
button_appearance={
|
|
||||||
'suggested': Adw.ResponseAppearance.SUGGESTED,
|
|
||||||
'destructive': Adw.ResponseAppearance.DESTRUCTIVE
|
|
||||||
}
|
|
||||||
|
|
||||||
# Don't call this directly outside this script
|
|
||||||
class baseDialog(Adw.AlertDialog):
|
|
||||||
__gtype_name__ = 'AlpacaDialogBase'
|
|
||||||
|
|
||||||
def __init__(self, heading:str, body:str, close_response:str, options:dict):
|
|
||||||
self.options = options
|
|
||||||
super().__init__(
|
|
||||||
heading=heading,
|
|
||||||
body=body,
|
|
||||||
close_response=close_response
|
|
||||||
)
|
|
||||||
for option, data in self.options.items():
|
|
||||||
self.add_response(option, option)
|
|
||||||
if 'appearance' in data:
|
|
||||||
self.set_response_appearance(option, button_appearance[data['appearance']])
|
|
||||||
if 'default' in data and data['default']:
|
|
||||||
self.set_default_response(option)
|
|
||||||
|
|
||||||
|
|
||||||
class Options(baseDialog):
|
|
||||||
__gtype_name__ = 'AlpacaDialogOptions'
|
|
||||||
|
|
||||||
def __init__(self, heading:str, body:str, close_response:str, options:dict):
|
|
||||||
super().__init__(
|
|
||||||
heading,
|
|
||||||
body,
|
|
||||||
close_response,
|
|
||||||
options
|
|
||||||
)
|
|
||||||
self.choose(
|
|
||||||
parent = window,
|
|
||||||
cancellable = None,
|
|
||||||
callback = self.response
|
|
||||||
)
|
|
||||||
|
|
||||||
def response(self, dialog, task):
|
|
||||||
result = dialog.choose_finish(task)
|
|
||||||
if result in self.options and 'callback' in self.options[result]:
|
|
||||||
self.options[result]['callback']()
|
|
||||||
|
|
||||||
class Entry(baseDialog):
|
|
||||||
__gtype_name__ = 'AlpacaDialogEntry'
|
|
||||||
|
|
||||||
def __init__(self, heading:str, body:str, close_response:str, options:dict, entries:list or dict):
|
|
||||||
super().__init__(
|
|
||||||
heading,
|
|
||||||
body,
|
|
||||||
close_response,
|
|
||||||
options
|
|
||||||
)
|
|
||||||
|
|
||||||
self.container = Gtk.Box(
|
|
||||||
orientation=1,
|
|
||||||
spacing=10
|
|
||||||
)
|
|
||||||
|
|
||||||
if isinstance(entries, dict):
|
|
||||||
entries = [entries]
|
|
||||||
|
|
||||||
for data in entries:
|
|
||||||
entry = Gtk.Entry()
|
|
||||||
if 'placeholder' in data and data['placeholder']:
|
|
||||||
entry.set_placeholder_text(data['placeholder'])
|
|
||||||
if 'css' in data and data['css']:
|
|
||||||
entry.set_css_classes(data['css'])
|
|
||||||
if 'text' in data and data['text']:
|
|
||||||
entry.set_text(data['text'])
|
|
||||||
self.container.append(entry)
|
|
||||||
|
|
||||||
self.set_extra_child(self.container)
|
|
||||||
|
|
||||||
self.connect('realize', lambda *_: list(self.container)[0].grab_focus())
|
|
||||||
self.choose(
|
|
||||||
parent = window,
|
|
||||||
cancellable = None,
|
|
||||||
callback = self.response
|
|
||||||
)
|
|
||||||
|
|
||||||
def response(self, dialog, task):
|
|
||||||
result = dialog.choose_finish(task)
|
|
||||||
if result in self.options and 'callback' in self.options[result]:
|
|
||||||
entry_results = []
|
|
||||||
for entry in list(self.container):
|
|
||||||
entry_results.append(entry.get_text())
|
|
||||||
self.options[result]['callback'](*entry_results)
|
|
||||||
|
|
||||||
class DropDown(baseDialog):
|
|
||||||
__gtype_name__ = 'AlpacaDialogDropDown'
|
|
||||||
|
|
||||||
def __init__(self, heading:str, body:str, close_response:str, options:dict, items:list):
|
|
||||||
super().__init__(
|
|
||||||
heading,
|
|
||||||
body,
|
|
||||||
close_response,
|
|
||||||
options
|
|
||||||
)
|
|
||||||
string_list = Gtk.StringList()
|
|
||||||
for item in items:
|
|
||||||
string_list.append(item)
|
|
||||||
self.set_extra_child(Gtk.DropDown(
|
|
||||||
enable_search=len(items) > 10,
|
|
||||||
model=string_list
|
|
||||||
))
|
|
||||||
|
|
||||||
self.connect('realize', lambda *_: self.get_extra_child().grab_focus())
|
|
||||||
self.choose(
|
|
||||||
parent = window,
|
|
||||||
cancellable = None,
|
|
||||||
callback = lambda dialog, task, dropdown=self.get_extra_child(): self.response(dialog, task, dropdown.get_selected_item().get_string())
|
|
||||||
)
|
|
||||||
|
|
||||||
def response(self, dialog, task, item:str):
|
|
||||||
result = dialog.choose_finish(task)
|
|
||||||
if result in self.options and 'callback' in self.options[result]:
|
|
||||||
self.options[result]['callback'](item)
|
|
||||||
|
|
||||||
def simple(heading:str, body:str, callback:callable, button_name:str=_('Accept'), button_appearance:str='suggested'):
|
|
||||||
options = {
|
|
||||||
_('Cancel'): {},
|
|
||||||
button_name: {
|
|
||||||
'appearance': button_appearance,
|
|
||||||
'callback': callback,
|
|
||||||
'default': True
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
return Options(heading, body, 'cancel', options)
|
|
||||||
|
|
||||||
def simple_entry(heading:str, body:str, callback:callable, entries:list or dict, button_name:str=_('Accept'), button_appearance:str='suggested'):
|
|
||||||
options = {
|
|
||||||
_('Cancel'): {},
|
|
||||||
button_name: {
|
|
||||||
'appearance': button_appearance,
|
|
||||||
'callback': callback,
|
|
||||||
'default': True
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
return Entry(heading, body, 'cancel', options, entries)
|
|
||||||
|
|
||||||
def simple_dropdown(heading:str, body:str, callback:callable, items:list, button_name:str=_('Accept'), button_appearance:str='suggested'):
|
|
||||||
options = {
|
|
||||||
_('Cancel'): {},
|
|
||||||
button_name: {
|
|
||||||
'appearance': button_appearance,
|
|
||||||
'callback': callback,
|
|
||||||
'default': True
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
return DropDown(heading, body, 'cancel', options, items)
|
|
||||||
|
|
||||||
def simple_file(file_filter:Gtk.FileFilter, callback:callable):
|
|
||||||
file_dialog = Gtk.FileDialog(default_filter=file_filter)
|
|
||||||
file_dialog.open(window, None, lambda file_dialog, result: callback(file_dialog.open_finish(result)) if result else None)
|
|
||||||
|
|
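The removed `dialog_widget` module dispatches button callbacks through an options dict keyed by the response label: each value may carry `'appearance'`, `'default'`, and `'callback'` keys, and only the chosen response's callback runs. A minimal sketch of that dispatch pattern with the GTK dependency stripped out (names here are illustrative, not from the diff):

```python
# Sketch of dialog_widget's options-dict dispatch, minus GTK.
# Mirrors Options.response / Entry.response: run the callback only if
# the chosen response exists in the dict and declares a 'callback'.

def dispatch_response(options: dict, result: str, *entry_results):
    if result in options and 'callback' in options[result]:
        return options[result]['callback'](*entry_results)

clicked = []
options = {
    'Cancel': {},  # no callback: closing with Cancel is a no-op
    'Accept': {'appearance': 'suggested', 'default': True,
               'callback': lambda: clicked.append('accepted')},
}
dispatch_response(options, 'Accept')
dispatch_response(options, 'Cancel')  # no callback -> no effect
```

The `simple`, `simple_entry`, and `simple_dropdown` helpers in the file build exactly this kind of dict with a Cancel entry plus one suggested/destructive action button.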
@@ -10,13 +10,12 @@ from gi.repository import Gtk, GObject, Gio, Adw, GtkSource, GLib, Gdk
 import logging, os, datetime, re, shutil, threading, sys
 from ..internal import config_dir, data_dir, cache_dir, source_dir
 from .table_widget import TableWidget
-from . import dialog_widget, terminal_widget
 
 logger = logging.getLogger(__name__)
 
 window = None
 
-class edit_text_block(Gtk.Box):
+class edit_text_block(Gtk.TextView):
     __gtype_name__ = 'AlpacaEditTextBlock'
 
     def __init__(self, text:str):
@@ -27,71 +26,21 @@ class edit_text_block(Gtk.Box):
             margin_bottom=5,
             margin_start=5,
             margin_end=5,
-            spacing=5,
-            orientation=1
-        )
-        self.text_view = Gtk.TextView(
-            halign=0,
-            hexpand=True,
-            css_classes=["view", "editing_message_textview"],
-            wrap_mode=3
-        )
-        cancel_button = Gtk.Button(
-            vexpand=False,
-            valign=2,
-            halign=2,
-            tooltip_text=_("Cancel"),
-            css_classes=['flat', 'circular'],
-            icon_name='cross-large-symbolic'
-        )
-        cancel_button.connect('clicked', lambda *_: self.cancel_edit())
-        save_button = Gtk.Button(
-            vexpand=False,
-            valign=2,
-            halign=2,
-            tooltip_text=_("Save Message"),
-            css_classes=['flat', 'circular'],
-            icon_name='paper-plane-symbolic'
-        )
-        save_button.connect('clicked', lambda *_: self.edit_message())
-        self.append(self.text_view)
-
-        button_container = Gtk.Box(
-            halign=2,
-            spacing=5
-        )
-        button_container.append(cancel_button)
-        button_container.append(save_button)
-        self.append(button_container)
-        self.text_view.get_buffer().insert(self.text_view.get_buffer().get_start_iter(), text, len(text.encode('utf-8')))
-        key_controller = Gtk.EventControllerKey.new()
-        key_controller.connect("key-pressed", self.handle_key)
-        self.text_view.add_controller(key_controller)
-
-    def handle_key(self, controller, keyval, keycode, state):
-        if keyval==Gdk.KEY_Return and not (state & Gdk.ModifierType.SHIFT_MASK):
-            self.save_edit()
-            return True
-        elif keyval==Gdk.KEY_Escape:
-            self.cancel_edit()
-            return True
-
-    def save_edit(self):
-        message_element = self.get_parent().get_parent()
-        message_element.action_buttons.set_visible(True)
-        message_element.set_text(self.text_view.get_buffer().get_text(self.text_view.get_buffer().get_start_iter(), self.text_view.get_buffer().get_end_iter(), False))
-        message_element.add_footer(message_element.dt)
-        window.save_history(message_element.get_parent().get_parent().get_parent().get_parent())
+            css_classes=["view", "editing_message_textview"]
+        )
+        self.get_buffer().insert(self.get_buffer().get_start_iter(), text, len(text.encode('utf-8')))
+        enter_key_controller = Gtk.EventControllerKey.new()
+        enter_key_controller.connect("key-pressed", lambda controller, keyval, keycode, state: self.edit_message() if keyval==Gdk.KEY_Return and not (state & Gdk.ModifierType.SHIFT_MASK) else None)
+        self.add_controller(enter_key_controller)
+
+    def edit_message(self):
+        self.get_parent().get_parent().action_buttons.set_visible(True)
+        self.get_parent().get_parent().set_text(self.get_buffer().get_text(self.get_buffer().get_start_iter(), self.get_buffer().get_end_iter(), False))
+        self.get_parent().get_parent().add_footer(self.get_parent().get_parent().dt)
+        window.save_history(self.get_parent().get_parent().get_parent().get_parent().get_parent().get_parent())
         self.get_parent().remove(self)
         window.show_toast(_("Message edited successfully"), window.main_overlay)
-
-    def cancel_edit(self):
-        message_element = self.get_parent().get_parent()
-        message_element.action_buttons.set_visible(True)
-        message_element.set_text(message_element.text)
-        message_element.add_footer(message_element.dt)
-        self.get_parent().remove(self)
+        return True
 
 class text_block(Gtk.Label):
     __gtype_name__ = 'AlpacaTextBlock'
@@ -110,7 +59,7 @@ class text_block(Gtk.Label):
             selectable=True
         )
         self.update_property([4, 7], [_("Response message") if bot else _("User message"), False])
-        self.connect('notify::has-focus', lambda *_: GLib.idle_add(self.remove_selection) if self.has_focus() else None)
+        self.connect('notify::has-focus', lambda *_: None if self.has_focus() else self.remove_selection() )
 
     def remove_selection(self):
         self.set_selectable(False)
@@ -154,14 +103,10 @@ class code_block(Gtk.Box):
         self.source_view.update_property([4], [_("{}Code Block").format('{} '.format(self.language.get_name()) if self.language else "")])
 
         title_box = Gtk.Box(margin_start=12, margin_top=3, margin_bottom=3, margin_end=3)
-        title_box.append(Gtk.Label(label=self.language.get_name() if self.language else (language_name.title() if language_name else _("Code Block")), hexpand=True, xalign=0))
+        title_box.append(Gtk.Label(label=self.language.get_name() if self.language else _("Code Block"), hexpand=True, xalign=0))
         copy_button = Gtk.Button(icon_name="edit-copy-symbolic", css_classes=["flat", "circular"], tooltip_text=_("Copy Message"))
         copy_button.connect("clicked", lambda *_: self.on_copy())
         title_box.append(copy_button)
-        if language_name and language_name.lower() in ['bash', 'python3']:
-            run_button = Gtk.Button(icon_name="execute-from-symbolic", css_classes=["flat", "circular"], tooltip_text=_("Run Script"))
-            run_button.connect("clicked", lambda *_: self.run_script(language_name))
-            title_box.append(run_button)
         self.append(title_box)
         self.append(Gtk.Separator())
         self.append(self.source_view)
@@ -176,18 +121,6 @@ class code_block(Gtk.Box):
         clipboard.set(text)
         window.show_toast(_("Code copied to the clipboard"), window.main_overlay)
 
-    def run_script(self, language_name):
-        logger.debug("Running script")
-        start = self.buffer.get_start_iter()
-        end = self.buffer.get_end_iter()
-        dialog_widget.simple(
-            _('Run Script'),
-            _('Make sure you understand what this script does before running it, Alpaca is not responsible for any damages to your device or data'),
-            lambda script=self.buffer.get_text(start, end, False), language_name=language_name: terminal_widget.run_terminal(script, language_name),
-            _('Execute'),
-            'destructive'
-        )
-
 class attachment(Gtk.Button):
     __gtype_name__ = 'AlpacaAttachment'
 
@@ -230,8 +163,7 @@ class attachment_container(Gtk.ScrolledWindow):
 
         self.container = Gtk.Box(
             orientation=0,
-            spacing=10,
-            valign=1
+            spacing=12
         )
 
         super().__init__(
@@ -239,8 +171,7 @@ class attachment_container(Gtk.ScrolledWindow):
             margin_start=10,
             margin_end=10,
             hexpand=True,
-            child=self.container,
-            vscrollbar_policy=2
+            child=self.container
         )
 
     def add_file(self, file:attachment):
@@ -296,7 +227,6 @@ class image(Gtk.Button):
             tooltip_text=_("Missing Image")
         )
         image_texture.update_property([4], [_("Missing image")])
-        self.set_overflow(1)
         self.connect("clicked", lambda button, file_path=os.path.join(head, '{selected_chat}', last_dir, file_name): window.preview_file(file_path, 'image', None))
 
 class image_container(Gtk.ScrolledWindow):
@@ -415,9 +345,6 @@ class action_buttons(Gtk.Box):
     def regenerate_message(self):
        chat = self.get_parent().get_parent().get_parent().get_parent().get_parent()
        message_element = self.get_parent()
-       if message_element.spinner:
-           message_element.container.remove(message_element.spinner)
-           message_element.spinner = None
        if not chat.busy:
            message_element.set_text()
            if message_element.footer:
@@ -469,15 +396,10 @@ class message(Gtk.Overlay):
             orientation=1,
             halign='fill',
             css_classes=["response_message"] if self.bot else ["card", "user_message"],
-            spacing=5,
-            width_request=-1 if self.bot else 375
+            spacing=12
         )
 
-        super().__init__(
-            css_classes=["message"],
-            name=message_id,
-            halign=0 if self.bot else 2
-        )
+        super().__init__(css_classes=["message"], name=message_id)
         self.set_child(self.container)
 
     def add_attachments(self, attachments:dict):
@@ -503,8 +425,6 @@ class message(Gtk.Overlay):
         if not self.action_buttons:
             self.action_buttons = action_buttons(self.bot)
             self.add_overlay(self.action_buttons)
-            if not self.text:
-                self.action_buttons.set_visible(False)
 
     def update_message(self, data:dict):
         chat = self.get_parent().get_parent().get_parent().get_parent()
@@ -517,7 +437,7 @@ class message(Gtk.Overlay):
                 GLib.idle_add(vadjustment.set_value, vadjustment.get_upper())
             elif vadjustment.get_value() + 50 >= vadjustment.get_upper() - vadjustment.get_page_size():
                 GLib.idle_add(vadjustment.set_value, vadjustment.get_upper() - vadjustment.get_page_size())
-            GLib.idle_add(self.content_children[-1].insert_at_end, data['message']['content'], False)
+            self.content_children[-1].insert_at_end(data['message']['content'], False)
             if 'done' in data and data['done']:
                 window.chat_list_box.get_tab_by_name(chat.get_name()).spinner.set_visible(False)
                 if window.chat_list_box.get_current_chat().get_name() != chat.get_name():
@@ -526,19 +446,12 @@ class message(Gtk.Overlay):
                     chat.container.remove(chat.welcome_screen)
                     chat.welcome_screen = None
                 chat.stop_message()
-                self.text = self.content_children[-1].get_label()
-                GLib.idle_add(self.set_text, self.content_children[-1].get_label())
+                self.set_text(self.content_children[-1].get_label())
                 self.dt = datetime.datetime.now()
-                GLib.idle_add(self.add_footer, self.dt)
+                self.add_footer(self.dt)
                 window.show_notification(chat.get_name(), self.text[:200] + (self.text[200:] and '...'), Gio.ThemedIcon.new("chat-message-new-symbolic"))
-                GLib.idle_add(window.save_history, chat)
+                window.save_history(chat)
         else:
-            if self.spinner:
-                GLib.idle_add(self.container.remove, self.spinner)
-                self.spinner = None
-            chat_tab = window.chat_list_box.get_tab_by_name(chat.get_name())
-            if chat_tab.spinner:
-                GLib.idle_add(chat_tab.spinner.set_visible, False)
             sys.exit()
 
     def set_text(self, text:str=None):
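The notification call in `update_message` truncates the message body with `self.text[:200] + (self.text[200:] and '...')`, relying on the fact that an empty slice is falsy in Python. A standalone sketch of that idiom:

```python
def truncate(text: str, limit: int = 200) -> str:
    # text[limit:] is '' (falsy) when the text already fits, so the
    # `and` expression contributes nothing; for longer text it
    # evaluates to '...' and marks the cut.
    return text[:limit] + (text[limit:] and '...')

print(truncate("short"))    # unchanged
print(truncate("x" * 300))  # 200 x's followed by an ellipsis
```

This avoids an explicit `if len(text) > limit` branch at the cost of slicing twice.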
@@ -548,14 +461,18 @@ class message(Gtk.Overlay):
         self.content_children = []
         if text:
             self.content_children = []
-            code_block_pattern = re.compile(r'```(\w*)\n(.*?)\n\s*```', re.DOTALL)
-            no_language_code_block_pattern = re.compile(r'`(\w*)\n(.*?)\n\s*`', re.DOTALL)
+            code_block_pattern = re.compile(r'```(\w+)\n(.*?)\n```', re.DOTALL)
+            no_lang_code_block_pattern = re.compile(r'`\n(.*?)\n`', re.DOTALL)
             table_pattern = re.compile(r'((\r?\n){2}|^)([^\r\n]*\|[^\r\n]*(\r?\n)?)+(?=(\r?\n){2}|$)', re.MULTILINE)
+            bold_pattern = re.compile(r'\*\*(.*?)\*\*') #"**text**"
+            code_pattern = re.compile(r'`([^`\n]*?)`') #"`text`"
+            h1_pattern = re.compile(r'^#\s(.*)$') #"# text"
+            h2_pattern = re.compile(r'^##\s(.*)$') #"## text"
             markup_pattern = re.compile(r'<(b|u|tt|span.*)>(.*?)<\/(b|u|tt|span)>') #heh butt span, I'm so funny
             parts = []
             pos = 0
             # Code blocks
-            for match in code_block_pattern.finditer(self.text[pos:]):
+            for match in code_block_pattern.finditer(self.text):
                 start, end = match.span()
                 if pos < start:
                     normal_text = self.text[pos:start]
@@ -564,17 +481,17 @@ class message(Gtk.Overlay):
                 code_text = match.group(2)
                 parts.append({"type": "code", "text": code_text, "language": 'python3' if language == 'python' else language})
                 pos = end
-            for match in no_language_code_block_pattern.finditer(self.text[pos:]):
+            # Code blocks (No language)
+            for match in no_lang_code_block_pattern.finditer(self.text):
                 start, end = match.span()
                 if pos < start:
                     normal_text = self.text[pos:start]
                     parts.append({"type": "normal", "text": normal_text.strip()})
-                language = match.group(1)
-                code_text = match.group(2)
+                code_text = match.group(1)
                 parts.append({"type": "code", "text": code_text, "language": None})
                 pos = end
             # Tables
-            for match in table_pattern.finditer(self.text[pos:]):
+            for match in table_pattern.finditer(self.text):
                 start, end = match.span()
                 if pos < start:
                     normal_text = self.text[pos:start]
@@ -583,8 +500,8 @@ class message(Gtk.Overlay):
                 parts.append({"type": "table", "text": table_text})
                 pos = end
             # Text blocks
-            if pos < len(self.text):
-                normal_text = self.text[pos:]
+            if pos < len(text):
+                normal_text = text[pos:]
                 if normal_text.strip():
                     parts.append({"type": "normal", "text": normal_text.strip()})
 
@@ -592,12 +509,10 @@ class message(Gtk.Overlay):
                 if part['type'] == 'normal':
                     text_b = text_block(self.bot)
                     part['text'] = part['text'].replace("\n* ", "\n• ")
-                    part['text'] = re.sub(r'`([^`\n]*?)`', r'<tt>\1</tt>', part['text'])
-                    part['text'] = re.sub(r'\*\*(.*?)\*\*', r'<b>\1</b>', part['text'], flags=re.MULTILINE)
-                    part['text'] = re.sub(r'^#\s+(.*)', r'<span size="x-large">\1</span>', part['text'], flags=re.MULTILINE)
-                    part['text'] = re.sub(r'^##\s+(.*)', r'<span size="large">\1</span>', part['text'], flags=re.MULTILINE)
-                    part['text'] = re.sub(r'_(\((.*?)\)|\d+)', r'<sub>\2\1</sub>', part['text'], flags=re.MULTILINE)
-                    part['text'] = re.sub(r'\^(\((.*?)\)|\d+)', r'<sup>\2\1</sup>', part['text'], flags=re.MULTILINE)
+                    part['text'] = code_pattern.sub(r'<tt>\1</tt>', part['text'])
+                    part['text'] = bold_pattern.sub(r'<b>\1</b>', part['text'])
+                    part['text'] = h1_pattern.sub(r'<span size="x-large">\1</span>', part['text'])
+                    part['text'] = h2_pattern.sub(r'<span size="large">\1</span>', part['text'])
                     pos = 0
                     for match in markup_pattern.finditer(part['text']):
                         start, end = match.span()
@@ -623,12 +538,8 @@ class message(Gtk.Overlay):
             text_b = text_block(self.bot)
             text_b.set_visible(False)
             self.content_children.append(text_b)
-            if self.spinner:
-                self.container.remove(self.spinner)
-                self.spinner = None
-            self.spinner = Gtk.Spinner(spinning=True, margin_top=10, margin_bottom=10, hexpand=True)
+            self.spinner = Gtk.Spinner(spinning=True, margin_top=12, margin_bottom=12, hexpand=True)
            self.container.append(self.spinner)
            self.container.append(text_b)
            self.container.queue_draw()
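Both sides of the `set_text` diff split a message into normal-text and code parts by scanning with a fenced-code regex and advancing a `pos` cursor past each match. A minimal, GTK-free sketch of that splitting loop, using the main-branch pattern (optional language tag, tolerant of indentation before the closing fence):

```python
import re

# Fenced-code pattern from the main side of the diff:
# optional language word after ```, non-greedy body, closing fence.
code_block_pattern = re.compile(r'```(\w*)\n(.*?)\n\s*```', re.DOTALL)

def split_message(text: str) -> list:
    parts, pos = [], 0
    for match in code_block_pattern.finditer(text):
        start, end = match.span()
        if pos < start:
            # Everything between the cursor and the fence is normal text.
            parts.append({"type": "normal", "text": text[pos:start].strip()})
        language, code_text = match.group(1), match.group(2)
        parts.append({"type": "code", "text": code_text,
                      "language": language or None})
        pos = end
    if pos < len(text):
        remainder = text[pos:].strip()
        if remainder:
            parts.append({"type": "normal", "text": remainder})
    return parts

msg = "Here:\n```python\nprint('hi')\n```\nDone."
print(split_message(msg))
```

The real widget runs further passes (no-language fences, tables) over the same cursor before rendering each part as a `text_block`, `code_block`, or `TableWidget`.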
@@ -7,10 +7,9 @@ import gi
 gi.require_version('Gtk', '4.0')
 gi.require_version('GtkSource', '5')
 from gi.repository import Gtk, GObject, Gio, Adw, GtkSource, GLib, Gdk
-import logging, os, datetime, re, shutil, threading, json, sys, glob
+import logging, os, datetime, re, shutil, threading, json, sys
 from ..internal import config_dir, data_dir, cache_dir, source_dir
-from .. import available_models_descriptions
-from . import dialog_widget
+from .. import available_models_descriptions, dialogs
 
 logger = logging.getLogger(__name__)
 
@@ -53,24 +52,6 @@ class model_selector_popup(Gtk.Popover):
             child=scroller
         )
 
-class model_selector_row(Gtk.ListBoxRow):
-    __gtype_name__ = 'AlpacaModelSelectorRow'
-
-    def __init__(self, model_name:str, data:dict):
-        super().__init__(
-            child = Gtk.Label(
-                label=window.convert_model_name(model_name, 0),
-                halign=1,
-                hexpand=True
-            ),
-            halign=0,
-            hexpand=True,
-            name=model_name,
-            tooltip_text=window.convert_model_name(model_name, 0)
-        )
-        self.data = data
-        self.image_recognition = 'projector_info' in self.data
-
 class model_selector_button(Gtk.MenuButton):
     __gtype_name__ = 'AlpacaModelSelectorButton'
 
@@ -82,13 +63,13 @@ class model_selector_button(Gtk.MenuButton):
             orientation=0,
             spacing=5
         )
-        self.label = Gtk.Label()
+        self.label = Gtk.Label(label=_('Select a Model'))
         container.append(self.label)
         container.append(Gtk.Image.new_from_icon_name("down-symbolic"))
         super().__init__(
+            tooltip_text=_('Select a Model'),
             child=container,
-            popover=self.popover,
-            halign=3
+            popover=self.popover
         )
 
     def change_model(self, model_name:str):
@@ -104,28 +85,28 @@ class model_selector_button(Gtk.MenuButton):
             self.label.set_label(window.convert_model_name(model_name, 0))
             self.set_tooltip_text(window.convert_model_name(model_name, 0))
         elif len(list(listbox)) == 0:
-            window.title_stack.set_visible_child_name('no_models')
+            self.label.set_label(_("Select a Model"))
+            self.set_tooltip_text(_("Select a Model"))
         window.model_manager.verify_if_image_can_be_used()
 
     def add_model(self, model_name:str):
-        data = None
-        response = window.ollama_instance.request("POST", "api/show", json.dumps({"name": model_name}))
-        if response.status_code != 200:
-            logger.error(f"Status code was {response.status_code}")
-            return
-        try:
-            data = json.loads(response.text)
-        except Exception as e:
-            logger.error(f"Error fetching 'api - show' info: {str(e)}")
-        model_row = model_selector_row(model_name, data)
-        GLib.idle_add(self.get_popover().model_list_box.append, model_row)
-        GLib.idle_add(self.change_model, model_name)
-        GLib.idle_add(window.title_stack.set_visible_child_name, 'model_selector')
+        model_row = Gtk.ListBoxRow(
+            child = Gtk.Label(
+                label=window.convert_model_name(model_name, 0),
+                halign=1,
+                hexpand=True
+            ),
+            halign=0,
+            hexpand=True,
+            name=model_name,
+            tooltip_text=window.convert_model_name(model_name, 0)
+        )
+        self.get_popover().model_list_box.append(model_row)
+        self.change_model(model_name)
 
     def remove_model(self, model_name:str):
         self.get_popover().model_list_box.remove(next((model for model in list(self.get_popover().model_list_box) if model.get_name() == model_name), None))
         self.model_changed(self.get_popover().model_list_box)
-        window.title_stack.set_visible_child_name('model_selector' if len(window.model_manager.get_model_list()) > 0 else 'no_models')
 
     def clear_list(self):
         self.get_popover().model_list_box.remove_all()
@ -177,16 +158,10 @@ class pulling_model(Gtk.ListBoxRow):
|
|||||||
icon_name = "media-playback-stop-symbolic",
|
icon_name = "media-playback-stop-symbolic",
|
||||||
vexpand = False,
|
vexpand = False,
|
||||||
valign = 3,
|
valign = 3,
|
||||||
css_classes = ["error", "circular"],
|
css_classes = ["destructive-action", "circular"],
|
||||||
tooltip_text = _("Stop Pulling '{}'").format(window.convert_model_name(model_name, 0))
|
tooltip_text = _("Stop Pulling '{}'").format(window.convert_model_name(model_name, 0))
|
||||||
)
|
)
|
||||||
stop_button.connect('clicked', lambda *i: dialog_widget.simple(
|
stop_button.connect('clicked', lambda *_: dialogs.stop_pull_model(window, self))
|
||||||
_('Stop Download?'),
|
|
||||||
_("Are you sure you want to stop pulling '{}'?").format(window.convert_model_name(self.get_name(), 0)),
|
|
||||||
self.stop,
|
|
||||||
_('Stop'),
|
|
||||||
'destructive'
|
|
||||||
))
|
|
||||||
|
|
||||||
container_box = Gtk.Box(
|
container_box = Gtk.Box(
|
||||||
hexpand=True,
|
hexpand=True,
|
||||||
@@ -207,27 +182,9 @@ class pulling_model(Gtk.ListBoxRow):
             name=model_name
         )
         self.error = None
-        self.digests = []
-
-    def stop(self):
-        if len(list(self.get_parent())) == 1:
-            self.get_parent().set_visible(False)
-        self.get_parent().remove(self)
 
     def update(self, data):
-        if 'digest' in data and data['digest'] not in self.digests:
-            self.digests.append(data['digest'].replace(':', '-'))
         if not self.get_parent():
-            logger.info("Pulling of '{}' was canceled".format(self.get_name()))
-            directory = os.path.join(data_dir, '.ollama', 'models', 'blobs')
-            for digest in self.digests:
-                files_to_delete = glob.glob(os.path.join(directory, digest + '*'))
-                for file in files_to_delete:
-                    logger.info("Deleting '{}'".format(file))
-                    try:
-                        os.remove(file)
-                    except Exception as e:
-                        logger.error(f"Can't delete file {file}: {e}")
             sys.exit()
         if 'error' in data:
             self.error = data['error']
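The removed cleanup code in this hunk reconstructs Ollama blob file names from layer digests by replacing the `:` separator with `-` and then globbing for matching files. A minimal standalone sketch of that name transformation (the digest value and directory below are made up for illustration):

```python
import os

def blob_patterns(digests, blob_dir):
    # Ollama stores each pulled layer as a blob file named after its digest,
    # with ':' swapped for '-' (e.g. 'sha256:abc' becomes 'sha256-abc').
    # The trailing '*' matches partial downloads as well.
    return [os.path.join(blob_dir, d.replace(':', '-') + '*') for d in digests]

patterns = blob_patterns(['sha256:0123abcd'], '/tmp/blobs')
```

The patterns would then be fed to `glob.glob()` to find the files to delete, as the removed `update()` did.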
@@ -249,37 +206,6 @@ class pulling_model_list(Gtk.ListBox):
             visible=False
         )
 
-class information_bow(Gtk.Box):
-    __gtype_name__ = 'AlpacaModelInformationBow'
-
-    def __init__(self, title:str, subtitle:str):
-        self.title = title
-        self.subtitle = subtitle
-        title_label = Gtk.Label(
-            label=self.title,
-            css_classes=['subtitle', 'caption', 'dim-label'],
-            hexpand=True,
-            margin_top=10,
-            margin_start=0,
-            margin_end=0
-        )
-        subtitle_label = Gtk.Label(
-            label=self.subtitle if self.subtitle else '(none)',
-            css_classes=['heading'],
-            hexpand=True,
-            margin_bottom=10,
-            margin_start=0,
-            margin_end=0
-        )
-        super().__init__(
-            spacing=5,
-            orientation=1,
-            css_classes=['card']
-        )
-        self.append(title_label)
-        self.append(subtitle_label)
-
-
 class local_model(Gtk.ListBoxRow):
     __gtype_name__ = 'AlpacaLocalModel'
 
@@ -307,31 +233,14 @@ class local_model(Gtk.ListBoxRow):
         description_box.append(model_label)
         description_box.append(tag_label)
 
-        info_button = Gtk.Button(
-            icon_name = "info-outline-symbolic",
-            vexpand = False,
-            valign = 3,
-            css_classes = ["circular"],
-            tooltip_text = _("Details")
-        )
-
-        info_button.connect('clicked', self.show_information)
-
         delete_button = Gtk.Button(
             icon_name = "user-trash-symbolic",
             vexpand = False,
             valign = 3,
-            css_classes = ["error", "circular"],
+            css_classes = ["destructive-action", "circular"],
             tooltip_text = _("Remove '{}'").format(window.convert_model_name(model_name, 0))
         )
-        delete_button.connect('clicked', lambda *i: dialog_widget.simple(
-            _('Delete Model?'),
-            _("Are you sure you want to delete '{}'?").format(model_title),
-            lambda model_name=model_name: window.model_manager.remove_local_model(model_name),
-            _('Delete'),
-            'destructive'
-        ))
+        delete_button.connect('clicked', lambda *_, model_name=model_name: dialogs.delete_model(window, model_name))
 
         container_box = Gtk.Box(
             hexpand=True,
@@ -344,7 +253,6 @@ class local_model(Gtk.ListBoxRow):
             margin_end=10
         )
         container_box.append(description_box)
-        container_box.append(info_button)
         container_box.append(delete_button)
 
         super().__init__(
@@ -352,53 +260,6 @@ class local_model(Gtk.ListBoxRow):
             name=model_name
         )
 
-    def show_information(self, button):
-        model = next((element for element in list(window.model_manager.model_selector.get_popover().model_list_box) if element.get_name() == self.get_name()), None)
-        model_name = model.get_child().get_label()
-
-        window.model_detail_page.set_title(' ('.join(model_name.split(' (')[:-1]))
-        window.model_detail_page.set_description(' ('.join(model_name.split(' (')[-1:])[:-1])
-        window.model_detail_create_button.set_name(model_name)
-        window.model_detail_create_button.set_tooltip_text(_("Create Model Based on '{}'").format(model_name))
-
-        details_flow_box = Gtk.FlowBox(
-            valign=1,
-            hexpand=True,
-            vexpand=False,
-            selection_mode=0,
-            max_children_per_line=2,
-            min_children_per_line=1,
-            margin_top=12,
-            margin_bottom=12,
-            margin_start=12,
-            margin_end=12
-        )
-
-        translation_strings={
-            'modified_at': _('Modified At'),
-            'parent_model': _('Parent Model'),
-            'format': _('Format'),
-            'family': _('Family'),
-            'parameter_size': _('Parameter Size'),
-            'quantization_level': _('Quantization Level')
-        }
-
-        if 'modified_at' in model.data and model.data['modified_at']:
-            details_flow_box.append(information_bow(
-                title=translation_strings['modified_at'],
-                subtitle=datetime.datetime.strptime(':'.join(model.data['modified_at'].split(':')[:2]), '%Y-%m-%dT%H:%M').strftime('%Y-%m-%d %H:%M')
-            ))
-
-        for name, value in model.data['details'].items():
-            if isinstance(value, str):
-                details_flow_box.append(information_bow(
-                    title=translation_strings[name] if name in translation_strings else name.replace('_', ' ').title(),
-                    subtitle=value
-                ))
-
-        window.model_detail_page.set_child(details_flow_box)
-        window.navigation_view_manage_models.push_by_tag('model_information')
-
 class local_model_list(Gtk.ListBox):
     __gtype_name__ = 'AlpacaLocalModelList'
 
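The removed `show_information()` above parses a selector label of the form `Title (tag)` by splitting on `' ('` and rejoining, so a `' ('` inside the title itself survives. A standalone sketch of that parsing (label value hypothetical):

```python
def split_model_label(label):
    # Mirrors the removed show_information() logic: everything before the
    # final ' (' is the title; the final parenthesized chunk, with its
    # closing ')' stripped, is the tag.
    title = ' ('.join(label.split(' (')[:-1])
    tag = ' ('.join(label.split(' (')[-1:])[:-1]
    return title, tag
```

For example, `split_model_label('Llama 3 (8b)')` yields the title and tag separately, which the removed code used for the detail page title and description.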
@@ -411,7 +272,7 @@ class local_model_list(Gtk.ListBox):
 
     def add_model(self, model_name:str):
         model = local_model(model_name)
-        GLib.idle_add(self.append, model)
+        self.append(model)
         if not self.get_visible():
             self.set_visible(True)
 
@@ -431,9 +292,7 @@ class available_model(Gtk.ListBoxRow):
             label="<b>{}</b> <small>by {}</small>".format(self.model_title, self.model_author),
             hexpand=True,
             halign=1,
-            use_markup=True,
-            wrap=True,
-            wrap_mode=0
+            use_markup=True
         )
         description_label = Gtk.Label(
             css_classes=["subtitle"],
@@ -555,7 +414,6 @@ class model_manager_container(Gtk.Box):
             spacing=12,
             orientation=1
         )
-
         self.pulling_list = pulling_model_list()
         self.append(self.pulling_list)
         self.local_list = local_model_list()
@@ -563,7 +421,7 @@ class model_manager_container(Gtk.Box):
         self.available_list = available_model_list()
         self.append(self.available_list)
         self.model_selector = model_selector_button()
-        window.title_stack.add_named(self.model_selector, 'model_selector')
+        window.header_bar.set_title_widget(self.model_selector)
 
     def add_local_model(self, model_name:str):
         self.local_list.add_model(model_name)
@@ -599,7 +457,6 @@ class model_manager_container(Gtk.Box):
         try:
             response = window.ollama_instance.request("GET", "api/tags")
             if response.status_code == 200:
-                self.model_selector.popover.model_list_box.remove_all()
                 self.local_list.remove_all()
                 data = json.loads(response.text)
                 if len(data['models']) == 0:
@@ -607,15 +464,12 @@ class model_manager_container(Gtk.Box):
                 else:
                     self.local_list.set_visible(True)
                     for model in data['models']:
-                        threading.Thread(target=self.add_local_model, args=(model['name'], )).start()
+                        self.add_local_model(model['name'])
             else:
                 window.connection_error()
         except Exception as e:
             logger.error(e)
             window.connection_error()
-        window.title_stack.set_visible_child_name('model_selector' if len(window.model_manager.get_model_list()) > 0 else 'no_models')
-        #window.title_stack.set_visible_child_name('model_selector')
-        window.chat_list_box.update_welcome_screens(len(self.get_model_list()) > 0)
 
     #Should only be called when the app starts
     def update_available_list(self):
@@ -628,16 +482,20 @@ class model_manager_container(Gtk.Box):
 
     def verify_if_image_can_be_used(self):
         logger.debug("Verifying if image can be used")
-        selected = self.model_selector.get_popover().model_list_box.get_selected_row()
-        if selected and selected.image_recognition:
-            for name, content in window.attachments.items():
-                if content['type'] == 'image':
-                    content['button'].set_css_classes(["flat"])
-            return True
-        elif selected:
+        selected = self.get_selected_model()
+        if selected == None:
+            return False
+        selected = selected.split(":")[0]
+        with open(os.path.join(source_dir, 'available_models.json'), 'r', encoding="utf-8") as f:
+            if selected in [key for key, value in json.load(f).items() if value["image"]]:
+                for name, content in window.attachments.items():
+                    if content['type'] == 'image':
+                        content['button'].set_css_classes(["flat"])
+                return True
             for name, content in window.attachments.items():
                 if content['type'] == 'image':
                     content['button'].set_css_classes(["flat", "error"])
+            return False
 
     def pull_model(self, model_name:str, modelfile:str=None):
         if ':' not in model_name:
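The 2.0.2 side of this hunk decides image support by stripping the `:tag` suffix from the selected model name and looking the base name up in `available_models.json`. A standalone sketch of that lookup; the catalog contents below are invented for illustration, not the real file:

```python
import json

# Hypothetical excerpt of available_models.json: each entry records
# whether the model accepts image input.
catalog = json.loads('{"llava": {"image": true}, "mistral": {"image": false}}')

def supports_images(selected_model):
    # Mirrors the 2.0.2 check: drop the ':tag' suffix, then test the base
    # name against catalog entries flagged with "image": true.
    base = selected_model.split(":")[0]
    return base in [key for key, value in catalog.items() if value["image"]]
```

The main branch instead asks the selected row widget for an `image_recognition` attribute, so the check no longer depends on the bundled JSON file.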
@@ -671,3 +529,4 @@ class model_manager_container(Gtk.Box):
         GLib.idle_add(window.chat_list_box.update_welcome_screens, len(self.get_model_list()) > 0)
         if len(list(self.pulling_list)) == 0:
             GLib.idle_add(self.pulling_list.set_visible, False)
+
@@ -1,91 +0,0 @@
-#chat_widget.py
-"""
-Handles the terminal widget
-"""
-
-import gi
-gi.require_version('Gtk', '4.0')
-gi.require_version('Vte', '3.91')
-from gi.repository import Gtk, Vte, GLib, Pango, GLib, Gdk
-import logging, os, shutil, subprocess, re
-from ..internal import data_dir
-
-logger = logging.getLogger(__name__)
-
-window = None
-
-class terminal(Vte.Terminal):
-    __gtype_name__ = 'AlpacaTerminal'
-
-    def __init__(self, script:list):
-        super().__init__(css_classes=["terminal"])
-        self.set_font(Pango.FontDescription.from_string("Monospace 12"))
-        self.set_clear_background(False)
-        pty = Vte.Pty.new_sync(Vte.PtyFlags.DEFAULT, None)
-
-        self.set_pty(pty)
-
-        pty.spawn_async(
-            GLib.get_current_dir(),
-            script,
-            [],
-            GLib.SpawnFlags.DEFAULT,
-            None,
-            None,
-            -1,
-            None,
-            None
-        )
-
-        key_controller = Gtk.EventControllerKey()
-        key_controller.connect("key-pressed", self.on_key_press)
-        self.add_controller(key_controller)
-
-    def on_key_press(self, controller, keyval, keycode, state):
-        ctrl = state & Gdk.ModifierType.CONTROL_MASK
-        shift = state & Gdk.ModifierType.SHIFT_MASK
-        if ctrl and keyval == Gdk.KEY_c:
-            self.copy_clipboard()
-            return True
-        return False
-
-def show_terminal(script):
-    window.terminal_scroller.set_child(terminal(script))
-    window.terminal_dialog.present(window)
-
-def run_terminal(script:str, language_name:str):
-    logger.info('Running: \n{}'.format(language_name))
-    if language_name == 'python3':
-        if not os.path.isdir(os.path.join(data_dir, 'pyenv')):
-            os.mkdir(os.path.join(data_dir, 'pyenv'))
-        with open(os.path.join(data_dir, 'pyenv', 'main.py'), 'w') as f:
-            f.write(script)
-        script = [
-            'echo "🐍 {}\n"'.format(_('Setting up Python environment...')),
-            'python3 -m venv "{}"'.format(os.path.join(data_dir, 'pyenv')),
-            '{} {}'.format(os.path.join(data_dir, 'pyenv', 'bin', 'python3').replace(' ', '\\ '), os.path.join(data_dir, 'pyenv', 'main.py').replace(' ', '\\ '))
-        ]
-        if os.path.isfile(os.path.join(data_dir, 'pyenv', 'requirements.txt')):
-            script.insert(1, '{} install -r {} | grep -v "already satisfied"; clear'.format(os.path.join(data_dir, 'pyenv', 'bin', 'pip3'), os.path.join(data_dir, 'pyenv', 'requirements.txt')))
-        else:
-            with open(os.path.join(data_dir, 'pyenv', 'requirements.txt'), 'w') as f:
-                f.write('')
-        script = ';\n'.join(script)
-
-    script += '; echo "\n🦙 {}"'.format(_('Script exited'))
-    if language_name == 'bash':
-        script = re.sub(r'(?m)^\s*sudo', 'pkexec', script)
-    if shutil.which('flatpak-spawn') and language_name == 'bash':
-        sandbox = True
-        try:
-            process = subprocess.run(['flatpak-spawn', '--host', 'bash', '-c', 'echo "test"'], check=True)
-            sandbox = False
-        except Exception as e:
-            pass
-        if sandbox:
-            script = 'echo "🦙 {}\n";'.format(_('The script is contained inside Flatpak')) + script
-            show_terminal(['bash', '-c', script])
-        else:
-            show_terminal(['flatpak-spawn', '--host', 'bash', '-c', script])
-    else:
-        show_terminal(['bash', '-c', script])
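The deleted `run_terminal()` above rewrites `sudo` invocations to `pkexec`, since the embedded VTE terminal has no interactive prompt for sudo's password. A minimal standalone sketch of that substitution (example script invented):

```python
import re

def elevate_with_pkexec(script):
    # Mirrors the deleted run_terminal() behaviour: any line beginning with
    # optional whitespace followed by 'sudo' is rewritten to use pkexec,
    # which shows a graphical authentication dialog instead.
    return re.sub(r'(?m)^\s*sudo', 'pkexec', script)
```

Note that the `(?m)` flag makes `^` match at every line start, and the `\s*` is consumed by the match, so leading indentation before `sudo` is dropped in the rewritten line.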
src/dialogs.py (new file, 418 lines)
@@ -0,0 +1,418 @@
+# dialogs.py
+"""
+Handles UI dialogs
+"""
+import os
+import logging, requests, threading, shutil
+from pytube import YouTube
+from html2text import html2text
+from gi.repository import Adw, Gtk
+from .internal import cache_dir
+
+logger = logging.getLogger(__name__)
+
+# CLEAR CHAT | WORKS
+
+def clear_chat_response(self, dialog, task):
+    if dialog.choose_finish(task) == "clear":
+        self.chat_list_box.get_current_chat().show_welcome_screen(len(self.model_manager.get_model_list()) > 0)
+        self.save_history(self.chat_list_box.get_current_chat())
+
+def clear_chat(self):
+    if self.chat_list_box.get_current_chat().busy:
+        self.show_toast(_("Chat cannot be cleared while receiving a message"), self.main_overlay)
+        return
+    dialog = Adw.AlertDialog(
+        heading=_("Clear Chat?"),
+        body=_("Are you sure you want to clear the chat?"),
+        close_response="cancel"
+    )
+    dialog.add_response("cancel", _("Cancel"))
+    dialog.add_response("clear", _("Clear"))
+    dialog.set_response_appearance("clear", Adw.ResponseAppearance.DESTRUCTIVE)
+    dialog.set_default_response("clear")
+    dialog.choose(
+        parent = self,
+        cancellable = None,
+        callback = lambda dialog, task: clear_chat_response(self, dialog, task)
+    )
+
+# DELETE CHAT | WORKS
+
+def delete_chat_response(self, dialog, task, chat_name):
+    if dialog.choose_finish(task) == "delete":
+        self.chat_list_box.delete_chat(chat_name)
+
+def delete_chat(self, chat_name):
+    dialog = Adw.AlertDialog(
+        heading=_("Delete Chat?"),
+        body=_("Are you sure you want to delete '{}'?").format(chat_name),
+        close_response="cancel"
+    )
+    dialog.add_response("cancel", _("Cancel"))
+    dialog.add_response("delete", _("Delete"))
+    dialog.set_response_appearance("delete", Adw.ResponseAppearance.DESTRUCTIVE)
+    dialog.set_default_response("delete")
+    dialog.choose(
+        parent = self,
+        cancellable = None,
+        callback = lambda dialog, task, chat_name=chat_name: delete_chat_response(self, dialog, task, chat_name)
+    )
+
+# RENAME CHAT | WORKS
+
+def rename_chat_response(self, dialog, task, old_chat_name, entry):
+    if not entry:
+        return
+    new_chat_name = entry.get_text()
+    if old_chat_name == new_chat_name:
+        return
+    if new_chat_name and (task is None or dialog.choose_finish(task) == "rename"):
+        self.chat_list_box.rename_chat(old_chat_name, new_chat_name)
+
+def rename_chat(self, chat_name):
+    entry = Gtk.Entry()
+    dialog = Adw.AlertDialog(
+        heading=_("Rename Chat?"),
+        body=_("Renaming '{}'").format(chat_name),
+        extra_child=entry,
+        close_response="cancel"
+    )
+    dialog.add_response("cancel", _("Cancel"))
+    dialog.add_response("rename", _("Rename"))
+    dialog.set_response_appearance("rename", Adw.ResponseAppearance.SUGGESTED)
+    dialog.set_default_response("rename")
+    dialog.choose(
+        parent = self,
+        cancellable = None,
+        callback = lambda dialog, task, old_chat_name=chat_name, entry=entry: rename_chat_response(self, dialog, task, old_chat_name, entry)
+    )
+
+# NEW CHAT | WORKS | UNUSED REASON: The 'Add Chat' button now creates a chat without a name AKA "New Chat"
+
+def new_chat_response(self, dialog, task, entry):
+    chat_name = _("New Chat")
+    if entry is not None and entry.get_text() != "":
+        chat_name = entry.get_text()
+    if chat_name and (task is None or dialog.choose_finish(task) == "create"):
+        self.new_chat(chat_name)
+
+
+def new_chat(self):
+    entry = Gtk.Entry()
+    dialog = Adw.AlertDialog(
+        heading=_("Create Chat?"),
+        body=_("Enter name for new chat"),
+        extra_child=entry,
+        close_response="cancel"
+    )
+    dialog.add_response("cancel", _("Cancel"))
+    dialog.add_response("create", _("Create"))
+    dialog.set_response_appearance("create", Adw.ResponseAppearance.SUGGESTED)
+    dialog.set_default_response("create")
+    dialog.choose(
+        parent = self,
+        cancellable = None,
+        callback = lambda dialog, task, entry=entry: new_chat_response(self, dialog, task, entry)
+    )
+
+# STOP PULL MODEL | WORKS
+
+def stop_pull_model_response(self, dialog, task, pulling_model):
+    if dialog.choose_finish(task) == "stop":
+        if len(list(pulling_model.get_parent())) == 1:
+            pulling_model.get_parent().set_visible(False)
+        pulling_model.get_parent().remove(pulling_model)
+
+def stop_pull_model(self, pulling_model):
+    dialog = Adw.AlertDialog(
+        heading=_("Stop Download?"),
+        body=_("Are you sure you want to stop pulling '{}'?").format(self.convert_model_name(pulling_model.get_name(), 0)),
+        close_response="cancel"
+    )
+    dialog.add_response("cancel", _("Cancel"))
+    dialog.add_response("stop", _("Stop"))
+    dialog.set_response_appearance("stop", Adw.ResponseAppearance.DESTRUCTIVE)
+    dialog.set_default_response("stop")
+    dialog.choose(
+        parent = self.manage_models_dialog,
+        cancellable = None,
+        callback = lambda dialog, task, model=pulling_model: stop_pull_model_response(self, dialog, task, model)
+    )
+
+# DELETE MODEL | WORKS
+
+def delete_model_response(self, dialog, task, model_name):
+    if dialog.choose_finish(task) == "delete":
+        self.model_manager.remove_local_model(model_name)
+
+def delete_model(self, model_name):
+    dialog = Adw.AlertDialog(
+        heading=_("Delete Model?"),
+        body=_("Are you sure you want to delete '{}'?").format(self.convert_model_name(model_name, 0)),
+        close_response="cancel"
+    )
+    dialog.add_response("cancel", _("Cancel"))
+    dialog.add_response("delete", _("Delete"))
+    dialog.set_response_appearance("delete", Adw.ResponseAppearance.DESTRUCTIVE)
+    dialog.set_default_response("delete")
+    dialog.choose(
+        parent = self.manage_models_dialog,
+        cancellable = None,
+        callback = lambda dialog, task, model_name = model_name: delete_model_response(self, dialog, task, model_name)
+    )
+
+# REMOVE IMAGE | WORKS
+
+def remove_attached_file_response(self, dialog, task, name):
+    if dialog.choose_finish(task) == 'remove':
+        self.file_preview_dialog.close()
+        self.remove_attached_file(name)
+
+def remove_attached_file(self, name):
+    dialog = Adw.AlertDialog(
+        heading=_("Remove Attachment?"),
+        body=_("Are you sure you want to remove attachment?"),
+        close_response="cancel"
+    )
+    dialog.add_response("cancel", _("Cancel"))
+    dialog.add_response("remove", _("Remove"))
+    dialog.set_response_appearance("remove", Adw.ResponseAppearance.DESTRUCTIVE)
+    dialog.set_default_response("remove")
+    dialog.choose(
+        parent = self,
+        cancellable = None,
+        callback = lambda dialog, task, name=name: remove_attached_file_response(self, dialog, task, name)
+    )
+
+# RECONNECT REMOTE | WORKS
+
+def reconnect_remote_response(self, dialog, task, url_entry, bearer_entry):
+    response = dialog.choose_finish(task)
+    if not task or response == "remote":
+        self.remote_connection_entry.set_text(url_entry.get_text())
+        self.remote_connection_switch.set_sensitive(url_entry.get_text())
+        self.remote_bearer_token_entry.set_text(bearer_entry.get_text())
+        self.remote_connection_switch.set_active(True)
+        self.model_manager.update_local_list()
+    elif response == "local":
+        self.ollama_instance.remote = False
+        self.ollama_instance.start()
+        self.model_manager.update_local_list()
+    elif response == "close":
+        self.destroy()
+
+def reconnect_remote(self):
+    entry_url = Gtk.Entry(
+        css_classes = ["error"],
+        text = self.ollama_instance.remote_url,
+        placeholder_text = "URL"
+    )
+    entry_bearer_token = Gtk.Entry(
+        css_classes = ["error"] if self.ollama_instance.bearer_token else None,
+        text = self.ollama_instance.bearer_token,
+        placeholder_text = "Bearer Token (Optional)"
+    )
+    container = Gtk.Box(
+        orientation = 1,
+        spacing = 10
+    )
+    container.append(entry_url)
+    container.append(entry_bearer_token)
+    dialog = Adw.AlertDialog(
+        heading=_("Connection Error"),
+        body=_("The remote instance has disconnected"),
+        extra_child=container
+    )
+    dialog.add_response("close", _("Close Alpaca"))
+    if shutil.which('ollama'):
+        dialog.add_response("local", _("Use local instance"))
+    dialog.add_response("remote", _("Connect"))
+    dialog.set_response_appearance("remote", Adw.ResponseAppearance.SUGGESTED)
+    dialog.set_default_response("remote")
+    dialog.choose(
+        parent = self,
+        cancellable = None,
+        callback = lambda dialog, task, url_entry=entry_url, bearer_entry=entry_bearer_token: reconnect_remote_response(self, dialog, task, url_entry, bearer_entry)
+    )
+
+# CREATE MODEL | WORKS
+
+def create_model_from_existing_response(self, dialog, task, dropdown):
+    model = dropdown.get_selected_item().get_string()
+    if dialog.choose_finish(task) == 'accept' and model:
+        self.create_model(model, False)
+
+def create_model_from_existing(self):
+    string_list = Gtk.StringList()
+    for model in self.model_manager.get_model_list():
+        string_list.append(self.convert_model_name(model, 0))
+
+    dropdown = Gtk.DropDown()
+    dropdown.set_model(string_list)
+    dialog = Adw.AlertDialog(
+        heading=_("Select Model"),
+        body=_("This model will be used as the base for the new model"),
+        extra_child=dropdown
+    )
+    dialog.add_response("cancel", _("Cancel"))
+    dialog.add_response("accept", _("Accept"))
+    dialog.set_response_appearance("accept", Adw.ResponseAppearance.SUGGESTED)
+    dialog.set_default_response("accept")
+    dialog.choose(
+        parent = self,
+        cancellable = None,
+        callback = lambda dialog, task, dropdown=dropdown: create_model_from_existing_response(self, dialog, task, dropdown)
+    )
+
+def create_model_from_file_response(self, file_dialog, result):
+    try:
+        file = file_dialog.open_finish(result)
+        try:
+            self.create_model(file.get_path(), True)
+        except Exception as e:
+            logger.error(e)
+            self.show_toast(_("An error occurred while creating the model"), self.main_overlay)
+    except Exception as e:
+        logger.error(e)
+
+def create_model_from_file(self):
+    file_dialog = Gtk.FileDialog(default_filter=self.file_filter_gguf)
+    file_dialog.open(self, None, lambda file_dialog, result: create_model_from_file_response(self, file_dialog, result))
+
+def create_model_from_name_response(self, dialog, task, entry):
+    model = entry.get_text().lower().strip()
+    if dialog.choose_finish(task) == 'accept' and model:
+        threading.Thread(target=self.model_manager.pull_model, kwargs={"model_name": model}).start()
+
+def create_model_from_name(self):
+    entry = Gtk.Entry()
+    entry.get_delegate().connect("insert-text", lambda *_ : self.check_alphanumeric(*_, ['-', '.', ':', '_', '/']))
+    dialog = Adw.AlertDialog(
+        heading=_("Pull Model"),
+        body=_("Input the name of the model in this format\nname:tag"),
+        extra_child=entry
+    )
+    dialog.add_response("cancel", _("Cancel"))
+    dialog.add_response("accept", _("Accept"))
+    dialog.set_response_appearance("accept", Adw.ResponseAppearance.SUGGESTED)
+    dialog.set_default_response("accept")
+    dialog.choose(
+        parent = self,
+        cancellable = None,
+        callback = lambda dialog, task, entry=entry: create_model_from_name_response(self, dialog, task, entry)
+    )
+# FILE CHOOSER | WORKS
+
+def attach_file_response(self, file_dialog, result):
+    file_types = {
+        "plain_text": ["txt", "md", "html", "css", "js", "py", "java", "json", "xml"],
+        "image": ["png", "jpeg", "jpg", "webp", "gif"],
+        "pdf": ["pdf"]
+    }
+    try:
+        file = file_dialog.open_finish(result)
+    except Exception as e:
+        logger.error(e)
+        return
+    extension = file.get_path().split(".")[-1]
+    file_type = next(key for key, value in file_types.items() if extension in value)
+    if not file_type:
+        return
+    if file_type == 'image' and not self.verify_if_image_can_be_used():
+        self.show_toast(_("Image recognition is only available on specific models"), self.main_overlay)
+        return
+    self.attach_file(file.get_path(), file_type)
+
+def attach_file(self, file_filter):
+    file_dialog = Gtk.FileDialog(default_filter=file_filter)
+    file_dialog.open(self, None, lambda file_dialog, result: attach_file_response(self, file_dialog, result))
+
# YouTube caption | WORKS
|
||||||
|
|
||||||
|
def youtube_caption_response(self, dialog, task, video_url, caption_drop_down):
|
||||||
|
if dialog.choose_finish(task) == "accept":
|
||||||
|
buffer = self.message_text_view.get_buffer()
|
||||||
|
text = buffer.get_text(buffer.get_start_iter(), buffer.get_end_iter(), False).replace(video_url, "")
|
||||||
|
buffer.delete(buffer.get_start_iter(), buffer.get_end_iter())
|
||||||
|
buffer.insert(buffer.get_start_iter(), text, len(text))
|
||||||
|
|
||||||
|
yt = YouTube(video_url)
|
||||||
|
text = "{}\n{}\n{}\n\n".format(yt.title, yt.author, yt.watch_url)
|
||||||
|
selected_caption = caption_drop_down.get_selected_item().get_string()
|
||||||
|
for event in yt.captions[selected_caption.split('(')[-1][:-1]].json_captions['events']:
|
||||||
|
text += "{}\n".format(event['segs'][0]['utf8'].replace('\n', '\\n'))
|
||||||
|
if not os.path.exists(os.path.join(cache_dir, 'tmp/youtube')):
|
||||||
|
os.makedirs(os.path.join(cache_dir, 'tmp/youtube'))
|
||||||
|
file_path = os.path.join(os.path.join(cache_dir, 'tmp/youtube'), f'{yt.title} ({selected_caption.split(" (")[0]})')
|
||||||
|
with open(file_path, 'w+', encoding="utf-8") as f:
|
||||||
|
f.write(text)
|
||||||
|
self.attach_file(file_path, 'youtube')
|
||||||
|
|
||||||
|
def youtube_caption(self, video_url):
|
||||||
|
yt = YouTube(video_url)
|
||||||
|
video_title = yt.title
|
||||||
|
captions = yt.captions
|
||||||
|
if len(captions) == 0:
|
||||||
|
self.show_toast(_("This video does not have any transcriptions"), self.main_overlay)
|
||||||
|
return
|
||||||
|
caption_list = Gtk.StringList()
|
||||||
|
for caption in captions:
|
||||||
|
caption_list.append("{} ({})".format(caption.name.title(), caption.code))
|
||||||
|
caption_drop_down = Gtk.DropDown(
|
||||||
|
enable_search=len(captions) > 10,
|
||||||
|
model=caption_list
|
||||||
|
)
|
||||||
|
dialog = Adw.AlertDialog(
|
||||||
|
heading=_("Attach YouTube Video?"),
|
||||||
|
body=_("{}\n\nPlease select a transcript to include").format(video_title),
|
||||||
|
extra_child=caption_drop_down,
|
||||||
|
close_response="cancel"
|
||||||
|
)
|
||||||
|
dialog.add_response("cancel", _("Cancel"))
|
||||||
|
dialog.add_response("accept", _("Accept"))
|
||||||
|
dialog.set_response_appearance("accept", Adw.ResponseAppearance.SUGGESTED)
|
||||||
|
dialog.set_default_response("accept")
|
||||||
|
dialog.choose(
|
||||||
|
parent = self,
|
||||||
|
cancellable = None,
|
||||||
|
callback = lambda dialog, task, video_url = video_url, caption_drop_down = caption_drop_down: youtube_caption_response(self, dialog, task, video_url, caption_drop_down)
|
||||||
|
)
|
||||||
|
|
||||||
|
# Website extraction |
|
||||||
|
|
||||||
|
def attach_website_response(self, dialog, task, url):
|
||||||
|
if dialog.choose_finish(task) == "accept":
|
||||||
|
response = requests.get(url)
|
||||||
|
if response.status_code == 200:
|
||||||
|
html = response.text
|
||||||
|
md = html2text(html)
|
||||||
|
buffer = self.message_text_view.get_buffer()
|
||||||
|
textview_text = buffer.get_text(buffer.get_start_iter(), buffer.get_end_iter(), False).replace(url, "")
|
||||||
|
buffer.delete(buffer.get_start_iter(), buffer.get_end_iter())
|
||||||
|
buffer.insert(buffer.get_start_iter(), textview_text, len(textview_text))
|
||||||
|
if not os.path.exists('/tmp/alpaca/websites/'):
|
||||||
|
os.makedirs('/tmp/alpaca/websites/')
|
||||||
|
md_name = self.generate_numbered_name('website.md', os.listdir('/tmp/alpaca/websites'))
|
||||||
|
file_path = os.path.join('/tmp/alpaca/websites/', md_name)
|
||||||
|
with open(file_path, 'w+', encoding="utf-8") as f:
|
||||||
|
f.write('{}\n\n{}'.format(url, md))
|
||||||
|
self.attach_file(file_path, 'website')
|
||||||
|
else:
|
||||||
|
self.show_toast(_("An error occurred while extracting text from the website"), self.main_overlay)
|
||||||
|
|
||||||
|
|
||||||
|
def attach_website(self, url):
|
||||||
|
dialog = Adw.AlertDialog(
|
||||||
|
heading=_("Attach Website? (Experimental)"),
|
||||||
|
body=_("Are you sure you want to attach\n'{}'?").format(url),
|
||||||
|
close_response="cancel"
|
||||||
|
)
|
||||||
|
dialog.add_response("cancel", _("Cancel"))
|
||||||
|
dialog.add_response("accept", _("Accept"))
|
||||||
|
dialog.set_response_appearance("accept", Adw.ResponseAppearance.SUGGESTED)
|
||||||
|
dialog.set_default_response("accept")
|
||||||
|
dialog.choose(
|
||||||
|
parent = self,
|
||||||
|
cancellable = None,
|
||||||
|
callback = lambda dialog, task, url=url: attach_website_response(self, dialog, task, url)
|
||||||
|
)
|
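The extension lookup above uses `next()` over the `file_types` mapping; a minimal standalone sketch (hypothetical names, not part of the diff) shows why the two-argument form of `next()` matters — without the `None` default, an unknown extension would raise `StopIteration` instead of reaching the `if not file_type` guard:

```python
# Hypothetical sketch of the extension-to-type lookup used in attach handlers.
FILE_TYPES = {
    "plain_text": ["txt", "md", "html", "css", "js", "py", "java", "json", "xml"],
    "image": ["png", "jpeg", "jpg", "webp", "gif"],
    "pdf": ["pdf"],
}

def detect_file_type(path: str):
    extension = path.split(".")[-1].lower()
    # The second argument to next() is returned when no key matches,
    # so callers can test for None instead of catching StopIteration.
    return next((key for key, exts in FILE_TYPES.items() if extension in exts), None)
```

With this shape, `detect_file_type("photo.PNG")` resolves to `"image"` while an unrecognized extension simply yields `None`.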
@@ -1,83 +0,0 @@
#generic_actions.py
"""
Working on organizing the code
"""

import os, requests
from youtube_transcript_api import YouTubeTranscriptApi
from html2text import html2text
from .internal import cache_dir

window = None

def connect_remote(remote_url:str, bearer_token:str):
    window.ollama_instance.remote_url=remote_url
    window.ollama_instance.bearer_token=bearer_token
    window.ollama_instance.remote = True
    window.ollama_instance.stop()
    window.model_manager.update_local_list()
    window.save_server_config()

def attach_youtube(video_title:str, video_author:str, watch_url:str, video_url:str, video_id:str, caption_name:str):
    buffer = window.message_text_view.get_buffer()
    text = buffer.get_text(buffer.get_start_iter(), buffer.get_end_iter(), False).replace(video_url, "")
    buffer.delete(buffer.get_start_iter(), buffer.get_end_iter())
    buffer.insert(buffer.get_start_iter(), text, len(text))

    result_text = "{}\n{}\n{}\n\n".format(video_title, video_author, watch_url)
    caption_name = caption_name.split(' (')[-1][:-1]

    if caption_name.startswith('Translate:'):
        available_captions = get_youtube_transcripts(video_id)
        original_caption_name = available_captions[0].split(' (')[-1][:-1]
        transcript = YouTubeTranscriptApi.list_transcripts(video_id).find_transcript([original_caption_name]).translate(caption_name.split(':')[-1]).fetch()
        result_text += '(Auto translated from {})\n'.format(available_captions[0])
    else:
        transcript = YouTubeTranscriptApi.get_transcript(video_id, languages=[caption_name])

    result_text += '\n'.join([t['text'] for t in transcript])

    if not os.path.exists(os.path.join(cache_dir, 'tmp/youtube')):
        os.makedirs(os.path.join(cache_dir, 'tmp/youtube'))
    file_path = os.path.join(os.path.join(cache_dir, 'tmp/youtube'), '{} ({})'.format(video_title.replace('/', ' '), caption_name))
    with open(file_path, 'w+', encoding="utf-8") as f:
        f.write(result_text)

    window.attach_file(file_path, 'youtube')

def get_youtube_transcripts(video_id:str):
    return ['{} ({})'.format(t.language, t.language_code) for t in YouTubeTranscriptApi.list_transcripts(video_id)]

def attach_website(url:str):
    response = requests.get(url)
    if response.status_code == 200:
        html = response.text
        md = html2text(html)
        buffer = window.message_text_view.get_buffer()
        textview_text = buffer.get_text(buffer.get_start_iter(), buffer.get_end_iter(), False).replace(url, "")
        buffer.delete(buffer.get_start_iter(), buffer.get_end_iter())
        buffer.insert(buffer.get_start_iter(), textview_text, len(textview_text))
        if not os.path.exists('/tmp/alpaca/websites/'):
            os.makedirs('/tmp/alpaca/websites/')
        md_name = window.generate_numbered_name('website.md', os.listdir('/tmp/alpaca/websites'))
        file_path = os.path.join('/tmp/alpaca/websites/', md_name)
        with open(file_path, 'w+', encoding="utf-8") as f:
            f.write('{}\n\n{}'.format(url, md))
        window.attach_file(file_path, 'website')
    else:
        window.show_toast(_("An error occurred while extracting text from the website"), window.main_overlay)

def attach_file(file):
    file_types = {
        "plain_text": ["txt", "md", "html", "css", "js", "py", "java", "json", "xml"],
        "image": ["png", "jpeg", "jpg", "webp", "gif"],
        "pdf": ["pdf"]
    }
    extension = file.get_path().split(".")[-1]
    # next() with a default avoids StopIteration on unknown extensions
    file_type = next((key for key, value in file_types.items() if extension in value), None)
    if not file_type:
        return
    if file_type == 'image' and not window.model_manager.verify_if_image_can_be_used():
        window.show_toast(_("Image recognition is only available on specific models"), window.main_overlay)
        return
    window.attach_file(file.get_path(), file_type)
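`attach_youtube` above round-trips caption labels: `get_youtube_transcripts` formats each transcript as `"Language (code)"`, and the code is later recovered with `split(' (')[-1][:-1]`. A self-contained sketch of just that string logic (hypothetical helper names, not part of the diff):

```python
# Sketch of the caption-label round-trip used in generic_actions.py.
def format_caption(language: str, code: str) -> str:
    # How labels are built for the dropdown, e.g. "English (en)"
    return '{} ({})'.format(language, code)

def parse_caption_code(label: str) -> str:
    # Take the text after the last " (" and drop the trailing ")"
    return label.split(' (')[-1][:-1]
```

Taking the last `' ('` segment is what makes this robust when the language name itself contains parentheses, e.g. `"Portuguese (Brazil) (pt-BR)"` still parses to `pt-BR`.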
@@ -1,2 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg xmlns="http://www.w3.org/2000/svg" height="16px" viewBox="0 0 16 16" width="16px"><path d="m 3 2 c -0.265625 0 -0.519531 0.105469 -0.707031 0.292969 c -0.390625 0.390625 -0.390625 1.023437 0 1.414062 l 4.292969 4.292969 l -4.292969 4.292969 c -0.390625 0.390625 -0.390625 1.023437 0 1.414062 s 1.023437 0.390625 1.414062 0 l 4.292969 -4.292969 l 4.292969 4.292969 c 0.390625 0.390625 1.023437 0.390625 1.414062 0 s 0.390625 -1.023437 0 -1.414062 l -4.292969 -4.292969 l 4.292969 -4.292969 c 0.390625 -0.390625 0.390625 -1.023437 0 -1.414062 c -0.1875 -0.1875 -0.441406 -0.292969 -0.707031 -0.292969 s -0.519531 0.105469 -0.707031 0.292969 l -4.292969 4.292969 l -4.292969 -4.292969 c -0.1875 -0.1875 -0.441406 -0.292969 -0.707031 -0.292969 z m 0 0" fill="#222222"/></svg>
@@ -1,2 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg xmlns="http://www.w3.org/2000/svg" height="16px" viewBox="0 0 16 16" width="16px"><path d="m 4.992188 2.996094 v 10 h 1 c 0.175781 0 0.347656 -0.039063 0.5 -0.125 l 7 -4 c 0.308593 -0.171875 0.46875 -0.523438 0.46875 -0.875 c 0 -0.351563 -0.160157 -0.703125 -0.46875 -0.875 l -7 -4 c -0.152344 -0.085938 -0.324219 -0.125 -0.5 -0.125 z m 0 0" fill="#222222"/></svg>
@@ -1,2 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg xmlns="http://www.w3.org/2000/svg" height="16px" viewBox="0 0 16 16" width="16px"><path d="m 8 0 c -4.410156 0 -8 3.589844 -8 8 s 3.589844 8 8 8 s 8 -3.589844 8 -8 s -3.589844 -8 -8 -8 z m 0 2 c 3.332031 0 6 2.667969 6 6 s -2.667969 6 -6 6 s -6 -2.667969 -6 -6 s 2.667969 -6 6 -6 z m 0 1.875 c -0.621094 0 -1.125 0.503906 -1.125 1.125 s 0.503906 1.125 1.125 1.125 s 1.125 -0.503906 1.125 -1.125 s -0.503906 -1.125 -1.125 -1.125 z m -1.523438 3.125 c -0.265624 0.011719 -0.476562 0.230469 -0.476562 0.5 c 0 0.277344 0.222656 0.5 0.5 0.5 h 0.5 v 3 h -0.5 c -0.277344 0 -0.5 0.222656 -0.5 0.5 s 0.222656 0.5 0.5 0.5 h 3 c 0.277344 0 0.5 -0.222656 0.5 -0.5 s -0.222656 -0.5 -0.5 -0.5 h -0.5 v -4 h -2.5 c -0.007812 0 -0.015625 0 -0.023438 0 z m 0 0" fill="#222222"/></svg>
@@ -40,7 +40,6 @@ translators = [
     'Louis Chauvet-Villaret (French) https://github.com/loulou64490',
     'Théo FORTIN (French) https://github.com/topiga',
     'Daimar Stein (Brazilian Portuguese) https://github.com/not-a-dev-stein',
-    'Bruno Antunes (Brazilian Portuguese) https://github.com/antun3s',
     'CounterFlow64 (Norwegian) https://github.com/CounterFlow64',
     'Aritra Saha (Bengali) https://github.com/olumolu',
     'Yuehao Sui (Simplified Chinese) https://github.com/8ar10der',
@@ -57,8 +56,7 @@ class AlpacaApplication(Adw.Application):
     def __init__(self, version):
         super().__init__(application_id='com.jeffser.Alpaca',
                          flags=Gio.ApplicationFlags.DEFAULT_FLAGS)
-        self.create_action('quit', lambda *_: self.props.active_window.closing_app(None), ['<primary>q'])
-        self.set_accels_for_action('app.delete_current_chat', ['<primary>w'])
+        self.create_action('quit', lambda *_: self.props.active_window.closing_app(None), ['<primary>w', '<primary>q'])
         self.create_action('preferences', lambda *_: self.props.active_window.preferences_dialog.present(self.props.active_window), ['<primary>comma'])
         self.create_action('about', self.on_about_action)
         self.set_accels_for_action("win.show-help-overlay", ['<primary>slash'])
@@ -40,19 +40,17 @@ alpaca_sources = [
   'main.py',
   'window.py',
   'connection_handler.py',
+  'dialogs.py',
   'available_models.json',
   'available_models_descriptions.py',
-  'internal.py',
-  'generic_actions.py'
+  'internal.py'
 ]
 
 custom_widgets = [
   'custom_widgets/table_widget.py',
   'custom_widgets/message_widget.py',
   'custom_widgets/chat_widget.py',
-  'custom_widgets/model_widget.py',
-  'custom_widgets/terminal_widget.py',
-  'custom_widgets/dialog_widget.py'
+  'custom_widgets/model_widget.py'
 ]
 
 install_data(alpaca_sources, install_dir: moduledir)
@@ -4,9 +4,6 @@
 .chat_image_button {
     padding: 0;
 }
-.chat_image_button, .chat_image_button image {
-    border-radius: 10px;
-}
 .editing_message_textview {
     border-radius: 5px;
     padding: 5px;
@@ -39,7 +36,4 @@ stacksidebar {
 }
 .code_block {
     font-family: monospace;
 }
-.terminal {
-    padding: 10px;
-}
src/window.py
@@ -24,7 +24,6 @@ from io import BytesIO
 from PIL import Image
 from pypdf import PdfReader
 from datetime import datetime
-from pytube import YouTube
 
 import gi
 gi.require_version('GtkSource', '5')
@@ -32,8 +31,8 @@ gi.require_version('GdkPixbuf', '2.0')
 
 from gi.repository import Adw, Gtk, Gdk, GLib, GtkSource, Gio, GdkPixbuf
 
-from . import connection_handler, generic_actions
-from .custom_widgets import message_widget, chat_widget, model_widget, terminal_widget, dialog_widget
+from . import dialogs, connection_handler
+from .custom_widgets import message_widget, chat_widget, model_widget
 from .internal import config_dir, data_dir, cache_dir, source_dir
 
 logger = logging.getLogger(__name__)
@@ -52,10 +51,10 @@ class AlpacaWindow(Adw.ApplicationWindow):
 
     #Variables
     attachments = {}
+    header_bar = Gtk.Template.Child()
 
     #Override elements
     overrides_group = Gtk.Template.Child()
-    instance_page = Gtk.Template.Child()
 
     #Elements
     split_view_overlay = Gtk.Template.Child()
@@ -69,7 +68,7 @@ class AlpacaWindow(Adw.ApplicationWindow):
     preferences_dialog = Gtk.Template.Child()
     shortcut_window : Gtk.ShortcutsWindow = Gtk.Template.Child()
     file_preview_dialog = Gtk.Template.Child()
-    file_preview_text_label = Gtk.Template.Child()
+    file_preview_text_view = Gtk.Template.Child()
     file_preview_image = Gtk.Template.Child()
     welcome_dialog = Gtk.Template.Child()
     welcome_carousel = Gtk.Template.Child()
@@ -94,17 +93,13 @@ class AlpacaWindow(Adw.ApplicationWindow):
     file_preview_remove_button = Gtk.Template.Child()
     secondary_menu_button = Gtk.Template.Child()
     model_searchbar = Gtk.Template.Child()
-    message_searchbar = Gtk.Template.Child()
-    message_search_button = Gtk.Template.Child()
-    searchentry_messages = Gtk.Template.Child()
     no_results_page = Gtk.Template.Child()
     model_link_button = Gtk.Template.Child()
-    title_stack = Gtk.Template.Child()
+    launch_dialog = Gtk.Template.Child()
+    launch_status = Gtk.Template.Child()
+    launch_level_bar = Gtk.Template.Child()
     manage_models_dialog = Gtk.Template.Child()
     model_scroller = Gtk.Template.Child()
-    model_detail_page = Gtk.Template.Child()
-    model_detail_create_button = Gtk.Template.Child()
-    ollama_information_label = Gtk.Template.Child()
 
     chat_list_container = Gtk.Template.Child()
     chat_list_box = None
@@ -116,14 +111,13 @@ class AlpacaWindow(Adw.ApplicationWindow):
     background_switch = Gtk.Template.Child()
     powersaver_warning_switch = Gtk.Template.Child()
     remote_connection_switch = Gtk.Template.Child()
+    remote_connection_entry = Gtk.Template.Child()
+    remote_bearer_token_entry = Gtk.Template.Child()
 
     banner = Gtk.Template.Child()
 
     style_manager = Adw.StyleManager()
 
-    terminal_scroller = Gtk.Template.Child()
-    terminal_dialog = Gtk.Template.Child()
-
     @Gtk.Template.Callback()
     def stop_message(self, button=None):
         self.chat_list_box.get_current_chat().stop_message()
@@ -212,8 +206,51 @@ class AlpacaWindow(Adw.ApplicationWindow):
             self.welcome_carousel.scroll_to(self.welcome_carousel.get_nth_page(self.welcome_carousel.get_position()+1), True)
         else:
             self.welcome_dialog.force_close()
+            if shutil.which('ollama'):
+                threading.Thread(target=self.prepare_alpaca, args=(11435, '', False, {'temperature': 0.7, 'seed': 0, 'keep_alive': 5}, {}, '', 0, True, True)).start()
+            else:
+                threading.Thread(target=self.prepare_alpaca, args=(11435, 'http://0.0.0.0:11434', True, {'temperature': 0.7, 'seed': 0, 'keep_alive': 5}, {}, '', 0, True, False)).start()
             self.powersaver_warning_switch.set_active(True)
 
+    @Gtk.Template.Callback()
+    def change_remote_connection(self, switcher, *_):
+        logger.debug("Connection switched")
+        if self.remote_connection_switch.get_active() and not self.remote_connection_entry.get_text():
+            self.remote_connection_switch.set_active(False)
+            return
+        self.ollama_instance.remote = self.remote_connection_switch.get_active()
+        if self.ollama_instance.remote:
+            self.ollama_instance.stop()
+        else:
+            self.ollama_instance.start()
+        if self.model_manager:
+            self.model_manager.update_local_list()
+        self.save_server_config()
+
+    @Gtk.Template.Callback()
+    def change_remote_url(self, entry):
+        if entry.get_text() and not entry.get_text().startswith("http"):
+            entry.set_text("http://{}".format(entry.get_text()))
+            return
+        if entry.get_text() and entry.get_text() != entry.get_text().rstrip('/'):
+            entry.set_text(entry.get_text().rstrip('/'))
+            return
+        self.remote_connection_switch.set_sensitive(entry.get_text())
+        logger.debug(f"Changing remote url: {self.ollama_instance.remote_url}")
+        self.ollama_instance.remote_url = entry.get_text()
+        if not entry.get_text():
+            self.remote_connection_switch.set_active(False)
+        if self.ollama_instance.remote and self.model_manager and entry.get_text():
+            self.model_manager.update_local_list()
+        self.save_server_config()
+
+    @Gtk.Template.Callback()
+    def change_remote_bearer_token(self, entry):
+        self.ollama_instance.bearer_token = entry.get_text()
+        if self.ollama_instance.remote_url and self.ollama_instance.remote and self.model_manager:
+            self.model_manager.update_local_list()
+        self.save_server_config()
+
     @Gtk.Template.Callback()
     def switch_run_on_background(self, switch, user_data):
         logger.debug("Switching run on background")
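The `change_remote_url` callback above normalizes the entry text by re-setting it (which re-triggers the callback) until it is stable. The same normalization can be sketched as a plain function (a hypothetical standalone helper, not code from the diff):

```python
# Sketch of the URL normalization change_remote_url applies to the entry text:
# add a scheme when one is missing, then strip any trailing slashes.
def normalize_remote_url(text: str) -> str:
    if text and not text.startswith("http"):
        text = "http://{}".format(text)
    return text.rstrip("/")
```

Doing it in one pass avoids the repeated `set_text`/callback round-trips the widget version relies on; the widget version works because each rewrite converges after at most two iterations.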
@@ -291,10 +328,6 @@ class AlpacaWindow(Adw.ApplicationWindow):
         self.model_manager.pulling_list.set_visible(not button.get_active() and len(list(self.model_manager.pulling_list)) > 0)
         self.model_manager.local_list.set_visible(not button.get_active() and len(list(self.model_manager.local_list)) > 0)
 
-    @Gtk.Template.Callback()
-    def message_search_toggle(self, button):
-        self.message_searchbar.set_search_mode(button.get_active())
-
     @Gtk.Template.Callback()
     def model_search_changed(self, entry):
         results = 0
@@ -310,24 +343,6 @@ class AlpacaWindow(Adw.ApplicationWindow):
         self.model_scroller.set_visible(True)
         self.no_results_page.set_visible(False)
 
-    @Gtk.Template.Callback()
-    def message_search_changed(self, entry, current_chat=None):
-        search_term=entry.get_text()
-        results = 0
-        if not current_chat:
-            current_chat = self.chat_list_box.get_current_chat()
-        if current_chat:
-            for key, message in current_chat.messages.items():
-                if message and message.text:
-                    message.set_visible(re.search(search_term, message.text, re.IGNORECASE))
-                    for block in message.content_children:
-                        if isinstance(block, message_widget.text_block):
-                            if search_term:
-                                highlighted_text = re.sub(f"({re.escape(search_term)})", r"<span background='yellow' bgalpha='30%'>\1</span>", block.get_text(), flags=re.IGNORECASE)
-                                block.set_markup(highlighted_text)
-                            else:
-                                block.set_markup(block.get_text())
-
     @Gtk.Template.Callback()
     def on_clipboard_paste(self, textview):
         logger.debug("Pasting from clipboard")
@@ -335,10 +350,6 @@ class AlpacaWindow(Adw.ApplicationWindow):
         clipboard.read_text_async(None, self.cb_text_received)
         clipboard.read_texture_async(None, self.cb_image_received)
 
-    @Gtk.Template.Callback()
-    def model_detail_create_button_clicked(self, button):
-        self.create_model(button.get_name(), False)
-
     def convert_model_name(self, name:str, mode:int) -> str: # mode=0 name:tag -> Name (tag) | mode=1 Name (tag) -> name:tag
         try:
             if mode == 0:
@@ -358,15 +369,20 @@ class AlpacaWindow(Adw.ApplicationWindow):
             modelfile_buffer.delete(modelfile_buffer.get_start_iter(), modelfile_buffer.get_end_iter())
             self.create_model_system.set_text('')
             if not file:
-                data = next((element for element in list(self.model_manager.model_selector.get_popover().model_list_box) if element.get_name() == self.convert_model_name(model, 1)), None).data
-                modelfile = []
-                for line in data['modelfile'].split('\n'):
-                    if line.startswith('SYSTEM'):
-                        self.create_model_system.set_text(line[len('SYSTEM'):].strip())
-                    if not line.startswith('SYSTEM') and not line.startswith('FROM') and not line.startswith('#'):
-                        modelfile.append(line)
-                self.create_model_name.set_text(self.convert_model_name(model, 1).split(':')[0] + "-custom")
-                modelfile_buffer.insert(modelfile_buffer.get_start_iter(), '\n'.join(modelfile), len('\n'.join(modelfile).encode('utf-8')))
+                response = self.ollama_instance.request("POST", "api/show", json.dumps({"name": self.convert_model_name(model, 1)}))
+                if response.status_code == 200:
+                    data = json.loads(response.text)
+                    modelfile = []
+                    for line in data['modelfile'].split('\n'):
+                        if line.startswith('SYSTEM'):
+                            self.create_model_system.set_text(line[len('SYSTEM'):].strip())
+                        if not line.startswith('SYSTEM') and not line.startswith('FROM') and not line.startswith('#'):
+                            modelfile.append(line)
+                    self.create_model_name.set_text(self.convert_model_name(model, 1).split(':')[0] + "-custom")
+                    modelfile_buffer.insert(modelfile_buffer.get_start_iter(), '\n'.join(modelfile), len('\n'.join(modelfile).encode('utf-8')))
+                else:
+                    ##TODO ERROR MESSAGE
+                    return
                 self.create_model_base.set_subtitle(self.convert_model_name(model, 1))
             else:
                 self.create_model_name.set_text(os.path.splitext(os.path.basename(model))[0])
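Both sides of the hunk above parse an Ollama Modelfile the same way: the `SYSTEM` line is pulled out for the system-prompt field, while `FROM` and comment lines are dropped from the editable template. A minimal standalone sketch of that filtering (hypothetical function name, not code from the diff):

```python
# Sketch of the Modelfile filtering in create_model: separate the SYSTEM
# prompt and keep only lines that are neither SYSTEM, FROM, nor comments.
def split_modelfile(modelfile: str):
    system = ""
    kept = []
    for line in modelfile.split("\n"):
        if line.startswith("SYSTEM"):
            system = line[len("SYSTEM"):].strip()
        elif not line.startswith("FROM") and not line.startswith("#"):
            kept.append(line)
    return system, "\n".join(kept)
```

This mirrors the loop in the diff, where the two `startswith('SYSTEM')` checks together act as the `if`/`elif` here.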
@@ -405,7 +421,7 @@ class AlpacaWindow(Adw.ApplicationWindow):
         if content:
             if file_type == 'image':
                 self.file_preview_image.set_visible(True)
-                self.file_preview_text_label.set_visible(False)
+                self.file_preview_text_view.set_visible(False)
                 image_data = base64.b64decode(content)
                 loader = GdkPixbuf.PixbufLoader.new()
                 loader.write(image_data)
@@ -418,8 +434,10 @@ class AlpacaWindow(Adw.ApplicationWindow):
                 self.file_preview_open_button.set_name(file_path)
             else:
                 self.file_preview_image.set_visible(False)
-                self.file_preview_text_label.set_visible(True)
-                self.file_preview_text_label.set_label(content)
+                self.file_preview_text_view.set_visible(True)
+                buffer = self.file_preview_text_view.get_buffer()
+                buffer.delete(buffer.get_start_iter(), buffer.get_end_iter())
+                buffer.insert(buffer.get_start_iter(), content, len(content.encode('utf-8')))
                 if file_type == 'youtube':
                     self.file_preview_dialog.set_title(content.split('\n')[0])
                     self.file_preview_open_button.set_name(content.split('\n')[2])
@@ -536,17 +554,12 @@ Generate a title following these rules:
         if self.regenerate_button:
             GLib.idle_add(self.chat_list_box.get_current_chat().remove, self.regenerate_button)
         try:
-            response = self.ollama_instance.request("POST", "api/chat", json.dumps(data), lambda data, message_element=message_element: message_element.update_message(data))
+            response = self.ollama_instance.request("POST", "api/chat", json.dumps(data), lambda data, message_element=message_element: GLib.idle_add(message_element.update_message, data))
             if response.status_code != 200:
                 raise Exception('Network Error')
         except Exception as e:
-            logger.error(e)
-            self.chat_list_box.get_tab_by_name(chat.get_name()).spinner.set_visible(False)
             chat.busy = False
             GLib.idle_add(message_element.add_action_buttons)
-            if message_element.spinner:
-                GLib.idle_add(message_element.container.remove, message_element.spinner)
-                message_element.spinner = None
             GLib.idle_add(chat.show_regenerate_button, message_element)
             GLib.idle_add(self.connection_error)
@@ -607,7 +620,6 @@ Generate a title following these rules:
         self.chat_list_box.prepend_chat(_("New Chat"))
 
-
     def generate_numbered_name(self, chat_name:str, compare_list:list) -> str:
         if chat_name in compare_list:
             for i in range(len(compare_list)):
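The hunk above only shows the signature and first lines of `generate_numbered_name`, which is used elsewhere to turn `website.md` into a unique file name. As illustration, here is one hypothetical implementation of such a helper — the actual method in window.py may differ:

```python
import os

# Hypothetical sketch: append an increasing counter before the extension
# until the name no longer collides with an existing one.
def generate_numbered_name(name: str, existing: list) -> str:
    if name not in existing:
        return name
    base, ext = os.path.splitext(name)
    i = 2
    while "{} {}{}".format(base, i, ext) in existing:
        i += 1
    return "{} {}{}".format(base, i, ext)
```

Splitting on the extension keeps the counter readable (`website 2.md` rather than `website.md 2`).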
@@ -627,16 +639,7 @@ Generate a title following these rules:
     def connection_error(self):
         logger.error("Connection error")
         if self.ollama_instance.remote:
-            options = {
-                _("Close Alpaca"): {"callback": lambda *_: self.get_application().quit(), "appearance": "destructive"},
-                _("Use Local Instance"): {"callback": lambda *_: self.remote_connection_switch.set_active(False)},
-                _("Connect"): {"callback": lambda url, bearer: generic_actions.connect_remote(url,bearer), "appearance": "suggested"}
-            }
-            entries = [
-                {"text": self.ollama_instance.remote_url, "css": ['error'], "placeholder": _('Server URL')},
-                {"text": self.ollama_instance.bearer_token, "css": ['error'] if self.ollama_instance.bearer_token else None, "placeholder": _('Bearer Token (Optional)')}
-            ]
-            dialog_widget.Entry(_('Connection Error'), _('The remote instance has disconnected'), list(options)[0], options, entries)
+            dialogs.reconnect_remote(self)
         else:
             self.ollama_instance.reset()
             self.show_toast(_("There was an error with the local Ollama instance, so it has been reset"), self.main_overlay)
@@ -681,8 +684,6 @@ Generate a title following these rules:
         del self.attachments[name]
         if len(self.attachments) == 0:
             self.attachment_box.set_visible(False)
-            if self.file_preview_dialog.get_visible():
-                self.file_preview_dialog.close()
 
     def attach_file(self, file_path, file_type):
         logger.debug(f"Attaching file: {file_path}")
@@ -708,6 +709,7 @@ Generate a title following these rules:
             child=button_content
         )
         self.attachments[file_name] = {"path": file_path, "type": file_type, "content": content, "button": button}
+        #button.connect("clicked", lambda button: dialogs.remove_attached_file(self, button))
         button.connect("clicked", lambda button : self.preview_file(file_path, file_type, file_name))
         self.attachment_container.append(button)
         self.attachment_box.set_visible(True)
@@ -717,23 +719,11 @@ Generate a title following these rules:
         chat_name = chat_row.label.get_label()
         action_name = action.get_name()
         if action_name in ('delete_chat', 'delete_current_chat'):
-            dialog_widget.simple(
-                _('Delete Chat?'),
-                _("Are you sure you want to delete '{}'?").format(chat_name),
-                lambda chat_name=chat_name, *_: self.chat_list_box.delete_chat(chat_name),
-                _('Delete'),
-                'destructive'
-            )
+            dialogs.delete_chat(self, chat_name)
         elif action_name in ('duplicate_chat', 'duplicate_current_chat'):
             self.chat_list_box.duplicate_chat(chat_name)
         elif action_name in ('rename_chat', 'rename_current_chat'):
-            dialog_widget.simple_entry(
-                _('Rename Chat?'),
-                _("Renaming '{}'").format(chat_name),
-                lambda new_chat_name, old_chat_name=chat_name, *_: self.chat_list_box.rename_chat(old_chat_name, new_chat_name),
-                {'placeholder': _('Chat name')},
-                _('Rename')
-            )
+            dialogs.rename_chat(self, chat_name)
         elif action_name in ('export_chat', 'export_current_chat'):
             self.chat_list_box.export_chat(chat_name)
 
@@ -741,36 +731,6 @@ Generate a title following these rules:
         self.selected_chat_row = self.chat_list_box.get_selected_row()
         self.chat_actions(action, user_data)
 
-    def youtube_detected(self, video_url):
-        try:
-            tries=0
-            while True:
-                try:
-                    yt = YouTube(video_url)
-                    video_title = yt.title
-                    break
-                except Exception as e:
-                    tries+=1
-                    if tries == 4:
-                        raise Exception(e)
-            transcriptions = generic_actions.get_youtube_transcripts(yt.video_id)
-            if len(transcriptions) == 0:
-                self.show_toast(_("This video does not have any transcriptions"), self.main_overlay)
-                return
-
-            if not any(filter(lambda x: '(en' in x and 'auto-generated' not in x and len(transcriptions) > 1, transcriptions)):
-                transcriptions.insert(1, 'English (translate:en)')
-
-            dialog_widget.simple_dropdown(
-                _('Attach YouTube Video?'),
-                _('{}\n\nPlease select a transcript to include').format(video_title),
-                lambda caption_name, yt=yt, video_url=video_url: generic_actions.attach_youtube(yt.title, yt.author, yt.watch_url, video_url, yt.video_id, caption_name),
-                transcriptions
-            )
-        except Exception as e:
-            logger.error(e)
-            self.show_toast(_("Error attaching video, please try again"), self.main_overlay)
-
     def cb_text_received(self, clipboard, result):
         try:
             text = clipboard.read_text_finish(result)
@@ -786,13 +746,13 @@ Generate a title following these rules:
                 r'(?:/[^\\s]*)?'
             )
             if youtube_regex.match(text):
-                self.youtube_detected(text)
+                try:
+                    dialogs.youtube_caption(self, text)
+                except Exception as e:
+                    logger.error(e)
+                    self.show_toast(_("This video is not available"), self.main_overlay)
             elif url_regex.match(text):
-                dialog_widget.simple(
-                    _('Attach Website? (Experimental)'),
-                    _("Are you sure you want to attach\n'{}'?").format(text),
-                    lambda url=text: generic_actions.attach_website(url)
-                )
+                dialogs.attach_website(self, text)
         except Exception as e:
             logger.error(e)
 
@@ -830,58 +790,19 @@ Generate a title following these rules:
     def power_saver_toggled(self, monitor):
         self.banner.set_revealed(monitor.get_power_saver_enabled() and self.powersaver_warning_switch.get_active())
 
-    def remote_switched(self, switch, state):
-        def local_instance_process():
-            sensitive_elements = [switch, self.tweaks_group, self.instance_page, self.send_button, self.attachment_button]
-
-            [element.set_sensitive(False) for element in sensitive_elements]
-            self.get_application().lookup_action('manage_models').set_enabled(False)
-            self.title_stack.set_visible_child_name('loading')
-
-            self.ollama_instance.remote = False
-            self.ollama_instance.start()
-            self.model_manager.update_local_list()
-            self.save_server_config()
-
-            [element.set_sensitive(True) for element in sensitive_elements]
-            self.get_application().lookup_action('manage_models').set_enabled(True)
-            self.title_stack.set_visible_child_name('model_selector' if len(self.model_manager.get_model_list()) > 0 else 'no_models')
-
-        if state:
-            options = {
-                _("Cancel"): {"callback": lambda *_: self.remote_connection_switch.set_active(False)},
-                _("Connect"): {"callback": lambda url, bearer: generic_actions.connect_remote(url, bearer), "appearance": "suggested"}
-            }
-            entries = [
-                {"text": self.ollama_instance.remote_url, "placeholder": _('Server URL')},
-                {"text": self.ollama_instance.bearer_token, "placeholder": _('Bearer Token (Optional)')}
-            ]
-            dialog_widget.Entry(
-                _('Connect Remote Instance'),
-                _('Enter instance information to continue'),
-                list(options)[0],
-                options,
-                entries
-            )
-        elif self.ollama_instance.remote:
-            threading.Thread(target=local_instance_process).start()
-
-    def prepare_alpaca(self, local_port:int, remote_url:str, remote:bool, tweaks:dict, overrides:dict, bearer_token:str, idle_timer_delay:int, save:bool):
-        #Model Manager
-        self.model_manager = model_widget.model_manager_container()
-        self.model_scroller.set_child(self.model_manager)
-
-        #Chat History
-        self.load_history()
-
+    def prepare_alpaca(self, local_port:int, remote_url:str, remote:bool, tweaks:dict, overrides:dict, bearer_token:str, idle_timer_delay:int, save:bool, show_launch_dialog:bool):
+        #Show launch dialog
+        if show_launch_dialog:
+            GLib.idle_add(self.launch_dialog.present, self)
+
         #Instance
+        self.launch_level_bar.set_value(0)
+        self.launch_status.set_description(_('Loading instance'))
         self.ollama_instance = connection_handler.instance(local_port, remote_url, remote, tweaks, overrides, bearer_token, idle_timer_delay)
 
-        #Model Manager P.2
-        self.model_manager.update_available_list()
-        self.model_manager.update_local_list()
-
         #User Preferences
+        self.launch_level_bar.set_value(1)
+        self.launch_status.set_description(_('Applying user preferences'))
         for element in list(list(list(list(self.tweaks_group)[0])[1])[0]):
             if element.get_name() in self.ollama_instance.tweaks:
                 element.set_value(self.ollama_instance.tweaks[element.get_name()])
@@ -891,29 +812,42 @@ Generate a title following these rules:
                 element.set_text(self.ollama_instance.overrides[element.get_name()])
 
         self.set_hide_on_close(self.background_switch.get_active())
-        self.instance_idle_timer.set_value(self.ollama_instance.idle_timer_delay)
+        self.remote_connection_entry.set_text(self.ollama_instance.remote_url)
+        self.remote_connection_switch.set_sensitive(self.remote_connection_entry.get_text())
+        self.remote_bearer_token_entry.set_text(self.ollama_instance.bearer_token)
         self.remote_connection_switch.set_active(self.ollama_instance.remote)
-        self.remote_connection_switch.get_activatable_widget().connect('state-set', self.remote_switched)
+        self.instance_idle_timer.set_value(self.ollama_instance.idle_timer_delay)
 
+        #Model Manager
+        self.model_manager = model_widget.model_manager_container()
+        self.model_scroller.set_child(self.model_manager)
+        self.launch_level_bar.set_value(2)
+        self.launch_status.set_description(_('Updating list of local models'))
+        self.model_manager.update_local_list()
+        self.launch_level_bar.set_value(3)
+        self.launch_status.set_description(_('Updating list of available models'))
+        self.model_manager.update_available_list()
+
+        #Chat History
+        self.launch_level_bar.set_value(4)
+        self.launch_status.set_description(_('Loading chats'))
+        GLib.idle_add(self.load_history)
+        self.launch_level_bar.set_value(5)
+
         #Save preferences
         if save:
             self.save_server_config()
-        self.send_button.set_sensitive(True)
-        self.attachment_button.set_sensitive(True)
-        self.remote_connection_switch.set_sensitive(True)
-        self.tweaks_group.set_sensitive(True)
-        self.instance_page.set_sensitive(True)
-        self.get_application().lookup_action('manage_models').set_enabled(True)
+        time.sleep(.5) #This is to prevent errors with gtk creating the launch dialog and closing it too quickly
+        #Close launch dialog
+        GLib.idle_add(self.launch_dialog.force_close)
 
     def __init__(self, **kwargs):
         super().__init__(**kwargs)
-        self.message_searchbar.connect('notify::search-mode-enabled', lambda *_: self.message_search_button.set_active(self.message_searchbar.get_search_mode()))
         message_widget.window = self
         chat_widget.window = self
         model_widget.window = self
-        dialog_widget.window = self
-        terminal_widget.window = self
-        generic_actions.window = self
         connection_handler.window = self
 
         drop_target = Gtk.DropTarget.new(Gdk.FileList, Gdk.DragAction.COPY)
@@ -933,11 +867,11 @@ Generate a title following these rules:
 
         universal_actions = {
             'new_chat': [lambda *_: self.chat_list_box.new_chat(), ['<primary>n']],
-            'clear': [lambda *i: dialog_widget.simple(_('Clear Chat?'), _('Are you sure you want to clear the chat?'), self.chat_list_box.get_current_chat().clear_chat, _('Clear')), ['<primary>e']],
+            'clear': [lambda *_: dialogs.clear_chat(self), ['<primary>e']],
             'import_chat': [lambda *_: self.chat_list_box.import_chat(), ['<primary>i']],
-            'create_model_from_existing': [lambda *i: dialog_widget.simple_dropdown(_('Select Model'), _('This model will be used as the base for the new model'), lambda model: self.create_model(model, False), [self.convert_model_name(model, 0) for model in self.model_manager.get_model_list()])],
-            'create_model_from_file': [lambda *i, file_filter=self.file_filter_gguf: dialog_widget.simple_file(file_filter, lambda file: self.create_model(file.get_path(), True))],
-            'create_model_from_name': [lambda *i: dialog_widget.simple_entry(_('Pull Model'), _('Input the name of the model in this format\nname:tag'), lambda model: threading.Thread(target=self.model_manager.pull_model, kwargs={"model_name": model}).start(), {'placeholder': 'llama3.2:latest'})],
+            'create_model_from_existing': [lambda *_: dialogs.create_model_from_existing(self)],
+            'create_model_from_file': [lambda *_: dialogs.create_model_from_file(self)],
+            'create_model_from_name': [lambda *_: dialogs.create_model_from_name(self)],
             'duplicate_chat': [self.chat_actions],
             'duplicate_current_chat': [self.current_chat_actions],
             'delete_chat': [self.chat_actions],
@@ -947,21 +881,16 @@ Generate a title following these rules:
             'export_chat': [self.chat_actions],
             'export_current_chat': [self.current_chat_actions],
             'toggle_sidebar': [lambda *_: self.split_view_overlay.set_show_sidebar(not self.split_view_overlay.get_show_sidebar()), ['F9']],
-            'manage_models': [lambda *_: self.manage_models_dialog.present(self), ['<primary>m']],
-            'search_messages': [lambda *_: self.message_searchbar.set_search_mode(not self.message_searchbar.get_search_mode()), ['<primary>f']]
+            'manage_models': [lambda *_: self.manage_models_dialog.present(self), ['<primary>m']]
         }
 
         for action_name, data in universal_actions.items():
             self.get_application().create_action(action_name, data[0], data[1] if len(data) > 1 else None)
 
-        self.get_application().lookup_action('manage_models').set_enabled(False)
-        self.remote_connection_switch.set_sensitive(False)
-        self.tweaks_group.set_sensitive(False)
-        self.instance_page.set_sensitive(False)
-
-        self.file_preview_remove_button.connect('clicked', lambda button : dialog_widget.simple(_('Remove Attachment?'), _("Are you sure you want to remove attachment?"), lambda button=button: self.remove_attached_file(button.get_name()), _('Remove'), 'destructive'))
-        self.attachment_button.connect("clicked", lambda button, file_filter=self.file_filter_attachments: dialog_widget.simple_file(file_filter, generic_actions.attach_file))
+        self.file_preview_remove_button.connect('clicked', lambda button : dialogs.remove_attached_file(self, button.get_name()))
+        self.attachment_button.connect("clicked", lambda button, file_filter=self.file_filter_attachments: dialogs.attach_file(self, file_filter))
         self.create_model_name.get_delegate().connect("insert-text", lambda *_: self.check_alphanumeric(*_, ['-', '.', '_']))
+        self.remote_connection_entry.connect("entry-activated", lambda entry : entry.set_css_classes([]))
         self.set_focus(self.message_text_view)
         if os.path.exists(os.path.join(config_dir, "server.json")):
             try:
@@ -973,16 +902,12 @@ Generate a title following these rules:
                 if 'powersaver_warning' not in data:
                     data['powersaver_warning'] = True
                 self.powersaver_warning_switch.set_active(data['powersaver_warning'])
-                threading.Thread(target=self.prepare_alpaca, args=(data['local_port'], data['remote_url'], data['run_remote'], data['model_tweaks'], data['ollama_overrides'], data['remote_bearer_token'], round(data['idle_timer']), False)).start()
+                threading.Thread(target=self.prepare_alpaca, args=(data['local_port'], data['remote_url'], data['run_remote'], data['model_tweaks'], data['ollama_overrides'], data['remote_bearer_token'], round(data['idle_timer']), False, True)).start()
             except Exception as e:
                 logger.error(e)
-                threading.Thread(target=self.prepare_alpaca, args=(11435, '', False, {'temperature': 0.7, 'seed': 0, 'keep_alive': 5}, {}, '', 0, True)).start()
+                threading.Thread(target=self.prepare_alpaca, args=(11435, '', False, {'temperature': 0.7, 'seed': 0, 'keep_alive': 5}, {}, '', 0, True, True)).start()
                 self.powersaver_warning_switch.set_active(True)
         else:
-            if shutil.which('ollama'):
-                threading.Thread(target=self.prepare_alpaca, args=(11435, '', False, {'temperature': 0.7, 'seed': 0, 'keep_alive': 5}, {}, '', 0, True)).start()
-            else:
-                threading.Thread(target=self.prepare_alpaca, args=(11435, 'http://0.0.0.0:11434', True, {'temperature': 0.7, 'seed': 0, 'keep_alive': 5}, {}, '', 0, True)).start()
             self.welcome_dialog.present(self)
 
         if self.powersaver_warning_switch.get_active():
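The `prepare_alpaca` refactor above moves all slow startup work onto a `threading.Thread` and routes every widget update (`launch_level_bar`, `launch_status`, presenting and closing `launch_dialog`) through `GLib.idle_add`, since GTK widgets may only be touched from the main loop. A minimal stand-alone sketch of that hand-off pattern, with a plain queue standing in for the GLib main loop (the names below are illustrative, not Alpaca's real API, and PyGObject is not assumed to be installed):

```python
import queue
import threading

# Stand-in for the GTK main loop: callbacks posted here run on the "UI" thread.
ui_queue = queue.Queue()

def idle_add(callback, *args):
    """Post a callback to the main loop, like GLib.idle_add."""
    ui_queue.put((callback, args))

progress = []  # mutated only by the "UI" thread

def set_progress(value, description):
    progress.append((value, description))

def prepare(show_dialog):
    # Worker thread: never touches UI state directly, only posts callbacks.
    if show_dialog:
        idle_add(set_progress, 0, 'Loading instance')
    idle_add(set_progress, 1, 'Applying user preferences')
    idle_add(set_progress, 5, 'Done')

worker = threading.Thread(target=prepare, args=(True,))
worker.start()
worker.join()

# "Main loop": drain the posted callbacks in FIFO order on the UI thread.
while not ui_queue.empty():
    callback, args = ui_queue.get()
    callback(*args)
```

The real code relies on GLib draining idle callbacks on the main thread; the queue here only models the ordering and single-threaded-mutation guarantees.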

195 src/window.ui
@@ -6,7 +6,7 @@
     <signal name="close-request" handler="closing_app"/>
     <property name="resizable">True</property>
     <property name="width-request">400</property>
-    <property name="height-request">600</property>
+    <property name="height-request">400</property>
     <property name="default-width">1300</property>
     <property name="default-height">800</property>
     <property name="title">Alpaca</property>
@@ -14,13 +14,12 @@
       <object class="AdwBreakpoint">
         <condition>max-width: 690sp</condition>
         <setter object="split_view_overlay" property="collapsed">true</setter>
-        <setter object="terminal_dialog" property="width-request">400</setter>
       </object>
     </child>
     <property name="content">
       <object class="AdwOverlaySplitView" id="split_view_overlay">
         <property name="show-sidebar" bind-source="show_sidebar_button" bind-property="active" bind-flags="sync-create"/>
-        <property name="sidebar-width-fraction">0.3</property>
+        <property name="sidebar-width-fraction">0.4</property>
         <property name="sidebar">
           <object class="AdwToolbarView">
             <child type="top">
@@ -55,9 +54,8 @@
         </property>
         <child>
           <object class="AdwToolbarView">
-            <property name="height-request">140</property>
             <child type="top">
-              <object class="AdwHeaderBar">
+              <object class="AdwHeaderBar" id="header_bar">
                 <child type="start">
                   <object class="GtkToggleButton" id="show_sidebar_button">
                     <property name="icon-name">sidebar-show-symbolic</property>
@@ -65,52 +63,6 @@
                     <property name="active" bind-source="split_view_overlay" bind-property="show-sidebar" bind-flags="sync-create"/>
                   </object>
                 </child>
-                <child type="start">
-                  <object class="GtkToggleButton" id="message_search_button">
-                    <property name="icon-name">edit-find-symbolic</property>
-                    <property name="tooltip-text" translatable="yes">Search Messages</property>
-                    <signal name="clicked" handler="message_search_toggle"/>
-                  </object>
-                </child>
-                <child type="title">
-                  <object class="GtkStack" id="title_stack">
-                    <property name="transition_duration">100</property>
-                    <property name="transition_type">1</property>
-                    <child>
-                      <object class="GtkStackPage">
-                        <property name="name">loading</property>
-                        <property name="child">
-                          <object class="GtkBox">
-                            <property name="orientation">0</property>
-                            <property name="spacing">10</property>
-                            <child>
-                              <object class="GtkSpinner">
-                                <property name="spinning">true</property>
-                              </object>
-                            </child>
-                            <child>
-                              <object class="GtkLabel">
-                                <property name="label" translatable="yes">Loading Instance</property>
-                              </object>
-                            </child>
-                          </object>
-                        </property>
-                      </object>
-                    </child>
-                    <child>
-                      <object class="GtkStackPage">
-                        <property name="name">no_models</property>
-                        <property name="child">
-                          <object class="GtkButton">
-                            <property name="label" translatable="yes">Manage Models</property>
-                            <property name="tooltip-text" translatable="yes">Manage Models</property>
-                            <property name="action-name">app.manage_models</property>
-                          </object>
-                        </property>
-                      </object>
-                    </child>
-                  </object>
-                </child>
                 <child type="end">
                   <object class="GtkMenuButton" id="secondary_menu_button">
                     <property name="primary">False</property>
@@ -121,24 +73,6 @@
             </child>
           </object>
         </child>
-        <child type="top">
-          <object class="GtkSearchBar" id="message_searchbar">
-            <accessibility>
-              <property name="label" translatable="yes">Message search bar</property>
-            </accessibility>
-            <property name="key-capture-widget">AlpacaWindow</property>
-            <child>
-              <object class="GtkSearchEntry" id="searchentry_messages">
-                <signal name="search-changed" handler="message_search_changed"/>
-                <property name="search-delay">200</property>
-                <property name="placeholder-text" translatable="yes">Search messages</property>
-                <accessibility>
-                  <property name="label" translatable="yes">Search messages</property>
-                </accessibility>
-              </object>
-            </child>
-          </object>
-        </child>
         <property name="content">
           <object class="GtkBox"><!--ACTUAL CONTENT-->
             <property name="orientation">1</property>
@@ -193,7 +127,6 @@
                 <object class="GtkButton" id="attachment_button">
                   <property name="vexpand">false</property>
                   <property name="valign">3</property>
-                  <property name="sensitive">false</property>
                   <property name="tooltip-text" translatable="yes">Attach File</property>
                   <style>
                     <class name="circular"/>
@@ -224,7 +157,6 @@
                   <signal name="paste-clipboard" handler="on_clipboard_paste"/>
                   <style>
                     <class name="message_text_view"/>
-                    <class name="undershoot-bottom"/>
                   </style>
                   <property name="wrap-mode">word</property>
                   <property name="top-margin">10</property>
@@ -246,7 +178,6 @@
                   <property name="vexpand">false</property>
                   <property name="valign">3</property>
                   <property name="tooltip-text" translatable="yes">Send Message</property>
-                  <property name="sensitive">false</property>
                   <style>
                     <class name="accent"/>
                     <class name="circular"/>
@@ -304,9 +235,24 @@
             <object class="AdwPreferencesGroup">
               <child>
                 <object class="AdwSwitchRow" id="remote_connection_switch">
+                  <signal name="notify::active" handler="change_remote_connection"/>
                   <property name="title" translatable="yes">Use Remote Connection to Ollama</property>
                 </object>
               </child>
+              <child>
+                <object class="AdwEntryRow" id="remote_connection_entry">
+                  <signal name="apply" handler="change_remote_url"/>
+                  <property name="title" translatable="yes">URL of Remote Instance</property>
+                  <property name="show-apply-button">true</property>
+                </object>
+              </child>
+              <child>
+                <object class="AdwEntryRow" id="remote_bearer_token_entry">
+                  <signal name="apply" handler="change_remote_bearer_token"/>
+                  <property name="title" translatable="yes">Bearer Token (Optional)</property>
+                  <property name="show-apply-button">true</property>
+                </object>
+              </child>
             </object>
           </child>
           <child>
@ -470,43 +416,26 @@
|
|||||||
</child>
|
</child>
|
||||||
</object>
|
</object>
|
||||||
</child>
|
</child>
|
||||||
<child>
|
|
||||||
<object class="AdwPreferencesGroup">
|
|
||||||
<child>
|
|
||||||
<object class="GtkLabel" id="ollama_information_label">
|
|
||||||
<property name="wrap">true</property>
|
|
||||||
<property name="use-markup">true</property>
|
|
||||||
<property name="label" translatable="yes">Integrated Ollama instance is not running</property>
|
|
||||||
<property name="justify">2</property>
|
|
||||||
<style>
|
|
||||||
<class name="dim-label"/>
|
|
||||||
</style>
|
|
||||||
</object>
|
|
||||||
</child>
|
|
||||||
</object>
|
|
||||||
</child>
|
|
||||||
</object>
|
</object>
|
||||||
</child>
|
</child>
|
||||||
</object>
|
</object>
|
||||||
|
|
||||||
<object class="AdwDialog" id="terminal_dialog">
|
<object class="AdwDialog" id="launch_dialog">
|
||||||
<accessibility>
|
<accessibility>
|
||||||
<property name="label" translatable="yes">Manage models dialog</property>
|
<property name="label" translatable="yes">Loading Alpaca dialog</property>
|
||||||
</accessibility>
|
</accessibility>
|
||||||
<property name="title" translatable="yes">Terminal</property>
|
<property name="width-request">400</property>
|
||||||
<property name="can-close">true</property>
|
<property name="can-close">false</property>
|
||||||
<property name="width-request">600</property>
|
|
||||||
<property name="height-request">600</property>
|
|
||||||
<child>
|
<child>
|
||||||
<object class="AdwToolbarView">
|
<object class="AdwStatusPage" id="launch_status">
|
||||||
<style>
|
<property name="icon_name">com.jeffser.Alpaca</property>
|
||||||
<class name="osd"/>
|
<property name="title" translatable="yes">Loading Alpaca...</property>
|
||||||
</style>
|
<property name="child">
|
||||||
<child type="top">
|
<object class="GtkLevelBar" id="launch_level_bar">
|
||||||
<object class="AdwHeaderBar"/>
|
<property name="mode">1</property>
|
||||||
</child>
|
<property name="min-value">0</property>
|
||||||
<property name="content">
|
<property name="max-value">5</property>
|
||||||
<object class="GtkScrolledWindow" id="terminal_scroller"/>
|
</object>
|
||||||
</property>
|
</property>
|
||||||
</object>
|
</object>
|
||||||
</child>
|
</child>
|
||||||
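The replacement launch dialog above drives a GtkLevelBar (`launch_level_bar`, range 0 to 5) instead of the old terminal view, suggesting a five-step startup sequence. A gi-free sketch of the stepping logic such a dialog might use; only the 0–5 range comes from the diff, and the step names are hypothetical:

```python
# Minimal model of the launch progress implied by launch_level_bar
# (min-value 0, max-value 5 in the UI definition above). The step
# names are illustrative, not taken from Alpaca's source.
class LaunchProgress:
    STEPS = ["settings", "ollama", "models", "chats", "window"]  # hypothetical

    def __init__(self):
        self.value = 0                    # mirrors the bar's starting value
        self.max_value = len(self.STEPS)  # 5, matching max-value in the XML

    def advance(self):
        """Move the bar one step; clamp at max like GtkLevelBar does."""
        self.value = min(self.value + 1, self.max_value)
        return self.value

p = LaunchProgress()
for _ in p.STEPS:
    p.advance()
print(p.value)  # 5 -> the point where can-close could flip back to true
```

In the real dialog each `advance()` would correspond to a `set_value()` call on the level bar from the loading thread.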
@@ -572,7 +501,6 @@
 <property name="vexpand">true</property>
 <child>
 <object class="GtkScrolledWindow" id="model_scroller">
-<property name="hscrollbar-policy">2</property>
 <property name="hexpand">true</property>
 <property name="vexpand">true</property>
 </object>
@@ -665,41 +593,6 @@
 </property>
 </object>
 </child>
-<child>
-<object class="AdwNavigationPage">
-<property name="title" translatable="yes">Model Details</property>
-<property name="tag">model_information</property>
-<property name="child">
-<object class="AdwToolbarView">
-<child type="top">
-<object class="AdwHeaderBar">
-<child type="start">
-<object class="GtkButton" id="model_detail_create_button">
-<signal name="clicked" handler="model_detail_create_button_clicked"/>
-<property name="icon-name">edit-copy-symbolic</property>
-</object>
-</child>
-</object>
-</child>
-<property name="content">
-<object class="GtkScrolledWindow">
-<property name="vexpand">true</property>
-<property name="hexpand">true</property>
-<child>
-<object class="AdwStatusPage" id="model_detail_page">
-<property name="icon-name">brain-augemnted-symbolic</property>
-<property name="description">text</property>
-<style>
-<class name="compact"/>
-</style>
-</object>
-</child>
-</object>
-</property>
-</object>
-</property>
-</object>
-</child>
 <child>
 <object class="AdwNavigationPage">
 <property name="title" translatable="yes">Create Model</property>
@@ -707,7 +600,14 @@
 <property name="child">
 <object class="AdwToolbarView">
 <child type="top">
-<object class="AdwHeaderBar"/>
+<object class="AdwHeaderBar">
+<child type="start">
+<object class="GtkButton">
+<signal name="clicked" handler="link_button_handler"/>
+<property name="icon-name">globe-symbolic</property>
+</object>
+</child>
+</object>
 </child>
 <property name="content">
 <object class="GtkScrolledWindow">
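The hunk above adds a header-bar button whose `clicked` signal is wired to `link_button_handler`; the handler itself is not shown in this diff. A hedged sketch of what a globe-icon handler typically does, with both the target URL and the injectable `opener` parameter being assumptions made for testability:

```python
import webbrowser

MODEL_LIBRARY_URL = "https://ollama.com/library"  # assumed target, not from the diff

def link_button_handler(button=None, opener=webbrowser.open):
    """Hypothetical handler for the globe-symbolic button: open a web page.

    `opener` is injected so the behaviour can be exercised without
    actually launching a browser.
    """
    return opener(MODEL_LIBRARY_URL)

# Record the URL instead of opening a browser:
opened = []
link_button_handler(opener=opened.append)
print(opened)  # ['https://ollama.com/library']
```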
@@ -782,9 +682,6 @@
 <object class="GtkScrolledWindow">
 <property name="margin-start">10</property>
 <property name="margin-end">10</property>
-<style>
-<class name="undershoot-bottom"/>
-</style>
 <child>
 <object class="GtkTextView" id="create_model_modelfile">
 <style>
@@ -884,12 +781,14 @@
 <child>
 <object class="GtkBox">
 <child>
-<object class="GtkLabel" id="file_preview_text_label">
+<object class="GtkTextView" id="file_preview_text_view">
 <property name="margin-top">12</property>
 <property name="margin-bottom">12</property>
 <property name="margin-start">12</property>
 <property name="margin-end">12</property>
-<property name="selectable">true</property>
+<property name="hexpand">true</property>
+<property name="vexpand">true</property>
+<property name="editable">false</property>
 </object>
 </child>
 <child>
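The hunk above swaps the selectable GtkLabel file preview for a non-editable GtkTextView. One way to sanity-check such a Builder fragment is to parse it and confirm the properties came through; a small sketch where the XML snippet simply mirrors the new side of the diff:

```python
import xml.etree.ElementTree as ET

# Fragment reproducing the post-change widget from the hunk above.
ui = """
<object class="GtkTextView" id="file_preview_text_view">
  <property name="margin-top">12</property>
  <property name="hexpand">true</property>
  <property name="vexpand">true</property>
  <property name="editable">false</property>
</object>
"""

widget = ET.fromstring(ui)
# Collect the direct <property> children into a name -> value dict.
props = {p.get("name"): p.text for p in widget.findall("property")}
print(widget.get("class"), props["editable"])  # GtkTextView false
```

A non-editable, expanding GtkTextView keeps long file previews scrollable while still read-only, which a GtkLabel cannot do as cleanly.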
@@ -1123,16 +1022,10 @@
 <child>
 <object class="GtkShortcutsGroup">
 <property name="title" translatable="yes">General</property>
-<child>
-<object class="GtkShortcutsShortcut">
-<property name="accelerator">&lt;ctrl&gt;Q</property>
-<property name="title" translatable="yes">Close application</property>
-</object>
-</child>
 <child>
 <object class="GtkShortcutsShortcut">
 <property name="accelerator">&lt;ctrl&gt;W</property>
-<property name="title" translatable="yes">Delete current chat</property>
+<property name="title" translatable="yes">Close application</property>
 </object>
 </child>
 <child>
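GTK accelerator strings like `<ctrl>Q` and `<ctrl>W` in the shortcuts hunk above encode modifier keys in angle brackets followed by the key name. A toy parser illustrating how the strings in this hunk break down; real code would call `Gtk.accelerator_parse` instead:

```python
import re

def split_accelerator(accel):
    """Split a GTK-style accelerator string into (modifiers, key).

    Illustrative re-implementation only, not GTK's own parser.
    """
    mods = re.findall(r"<([^>]+)>", accel)       # names inside <...>
    key = re.sub(r"<[^>]+>", "", accel)          # whatever remains is the key
    return mods, key

print(split_accelerator("<ctrl>Q"))  # (['ctrl'], 'Q')
print(split_accelerator("<ctrl>W"))  # (['ctrl'], 'W')
```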

@@ -24,7 +24,3 @@ echo "Updating Ukrainian"
 msgmerge --no-fuzzy-matching -U po/uk.po po/alpaca.pot
 echo "Updating German"
 msgmerge --no-fuzzy-matching -U po/de.po po/alpaca.pot
-echo "Updating Hebrew"
-msgmerge --no-fuzzy-matching -U po/he.po po/alpaca.pot
-echo "Updating Telugu"
-msgmerge --no-fuzzy-matching -U po/te.po po/alpaca.pot
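The po-update script above repeats an `echo` plus `msgmerge` pair per locale, and this hunk drops the Hebrew and Telugu pairs on the 2.0.2 side. A sketch that generates those command lines from a locale table instead of repeating them by hand; the locale names come from the hunk, but the loop itself is an assumption, not the project's actual script:

```python
# Locales on the 2.0.2 side of the hunk; main additionally had
# "he" (Hebrew) and "te" (Telugu).
LOCALES = {"uk": "Ukrainian", "de": "German"}

def po_update_commands(locales):
    """Build the echo + msgmerge command lines the script runs per locale."""
    cmds = []
    for code, name in locales.items():
        cmds.append(f'echo "Updating {name}"')
        cmds.append(f"msgmerge --no-fuzzy-matching -U po/{code}.po po/alpaca.pot")
    return cmds

for line in po_update_commands(LOCALES):
    print(line)
```

`msgmerge -U` updates each `.po` file in place against the regenerated `alpaca.pot` template, and `--no-fuzzy-matching` avoids pairing new strings with loosely similar old translations.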