Compare commits


No commits in common. "main" and "0.4.0" have entirely different histories.
main ... 0.4.0

92 changed files with 1288 additions and 82870 deletions

.github/FUNDING.yml

@@ -1,7 +1,6 @@
# These are supported funding model platforms
github: jeffser # Replace with up to 4 GitHub Sponsors-enabled usernames e.g., [user1, user2]
#ko_fi: jeffser
#patreon: # Replace with a single Patreon username
#open_collective: # Replace with a single Open Collective username
#ko_fi: # Replace with a single Ko-fi username


@@ -1,22 +0,0 @@
---
name: Bug report
about: Something is wrong
title: ''
labels: bug
assignees: ''
---
<!--Please be aware that GNOME Code of Conduct applies to Alpaca, https://conduct.gnome.org/-->
**Describe the bug**
A clear and concise description of what the bug is.
**Expected behavior**
A clear and concise description of what you expected to happen.
**Screenshots**
If applicable, add screenshots to help explain your problem.
**Debugging information**
```
Please paste here the debugging information available at 'About Alpaca' > 'Troubleshooting' > 'Debugging Information'
```


@@ -1,20 +0,0 @@
---
name: Feature request
about: Suggest an idea for this project
title: ''
labels: enhancement
assignees: ''
---
<!--Please be aware that GNOME Code of Conduct applies to Alpaca, https://conduct.gnome.org/-->
**Is your feature request related to a problem? Please describe.**
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]
**Describe the solution you'd like**
A clear and concise description of what you want to happen.
**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.
**Additional context**
Add any other context or screenshots about the feature request here.


@@ -1,18 +0,0 @@
# .github/workflows/flatpak-build.yml
on:
  workflow_dispatch:
name: Flatpak Build
jobs:
  flatpak:
    name: "Flatpak"
    runs-on: ubuntu-latest
    container:
      image: bilelmoussaoui/flatpak-github-actions:gnome-46
      options: --privileged
    steps:
      - uses: actions/checkout@v4
      - uses: flatpak/flatpak-github-actions/flatpak-builder@v6
        with:
          bundle: com.jeffser.Alpaca.flatpak
          manifest-path: com.jeffser.Alpaca.json
          cache-key: flatpak-builder-${{ github.sha }}


@@ -1,24 +0,0 @@
name: Pylint
on:
  workflow_dispatch:
jobs:
  build:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: ["3.11"]
    steps:
      - uses: actions/checkout@v4
      - name: Set up Python ${{ matrix.python-version }}
        uses: actions/setup-python@v3
        with:
          python-version: ${{ matrix.python-version }}
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install pylint
      - name: Analysing the code with pylint
        run: |
          pylint --rcfile=.pylintrc $(git ls-files '*.py' | grep -v 'src/available_models_descriptions.py')


@@ -1,14 +0,0 @@
[MASTER]
[MESSAGES CONTROL]
disable=undefined-variable, line-too-long, missing-function-docstring, consider-using-f-string, import-error
[FORMAT]
max-line-length=200
# Reasons for removing some checks:
# undefined-variable: _() is used by the translator on build time but it is not defined on the scripts
# line-too-long: I... I'm too lazy to make the lines shorter, maybe later
# missing-function-docstring: I'm not adding a docstring to all the functions, most are self-explanatory
# consider-using-f-string: I can't use f-string because of the translator
# import-error: The linter doesn't have access to all the libraries that the project itself does


@@ -1,34 +0,0 @@
<Project xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
         xmlns:rdfs="http://www.w3.org/2000/01/rdf-schema#"
         xmlns:foaf="http://xmlns.com/foaf/0.1/"
         xmlns:gnome="http://api.gnome.org/doap-extensions#"
         xmlns="http://usefulinc.com/ns/doap#">
  <name xml:lang="en">Alpaca</name>
  <shortdesc xml:lang="en">An Ollama client made with GTK4 and Adwaita</shortdesc>
  <homepage rdf:resource="https://jeffser.com/alpaca" />
  <bug-database rdf:resource="https://github.com/Jeffser/Alpaca/issues"/>
  <programming-language>Python</programming-language>
  <platform>GTK 4</platform>
  <platform>Libadwaita</platform>
  <maintainer>
    <foaf:Person>
      <foaf:name>Jeffry Samuel</foaf:name>
      <foaf:mbox rdf:resource="mailto:jeffrysamuer@gmail.com"/>
      <foaf:account>
        <foaf:OnlineAccount>
          <foaf:accountServiceHomepage rdf:resource="https://github.com"/>
          <foaf:accountName>jeffser</foaf:accountName>
        </foaf:OnlineAccount>
      </foaf:account>
      <foaf:account>
        <foaf:OnlineAccount>
          <foaf:accountServiceHomepage rdf:resource="https://gitlab.gnome.org"/>
          <foaf:accountName>jeffser</foaf:accountName>
        </foaf:OnlineAccount>
      </foaf:account>
    </foaf:Person>
  </maintainer>
</Project>


@@ -1,4 +0,0 @@
Alpaca follows [GNOME's code of conduct](https://conduct.gnome.org/); please make sure to read it before interacting in any way with this repository.
To report any misconduct please reach out via private message on
- X (formerly Twitter): [@jeffrysamuer](https://x.com/jeffrysamuer)
- Mastodon: [@jeffser@floss.social](https://floss.social/@jeffser)


@@ -1,30 +0,0 @@
# Contributing Rules
## Translations
If you want to translate or contribute on existing translations please read [this discussion](https://github.com/Jeffser/Alpaca/discussions/153).
## Code
1) Before contributing code, make sure there's an open [issue](https://github.com/Jeffser/Alpaca/issues) for that particular problem or feature.
2) Ask to contribute in the responses to the issue.
3) Wait for [my](https://github.com/Jeffser) approval; I might have already started working on that issue.
4) Test your code before submitting a pull request.
## Q&A
### Do I need to comment my code?
There's no need to add comments if the code is easy to read by itself.
### What if I need help or I don't understand the existing code?
You can reach out on the issue; I'll try to answer as soon as possible.
### What IDE should I use?
I use Gnome Builder but you can use whatever you want.
### Can I be credited?
You might be credited on the GitHub repository in the [thanks](https://github.com/Jeffser/Alpaca/blob/main/README.md#thanks) section of the README.

README.md

@@ -1,108 +1,50 @@
<p align="center"><img src="https://jeffser.com/images/alpaca/logo.svg"></p>
<img src="https://jeffser.com/images/alpaca/logo.svg">
# Alpaca
<a href='https://flathub.org/apps/com.jeffser.Alpaca'><img width='240' alt='Download on Flathub' src='https://flathub.org/api/badge?locale=en'/></a>
An [Ollama](https://github.com/ollama/ollama) client made with GTK4 and Adwaita.
Alpaca is an [Ollama](https://github.com/ollama/ollama) client where you can manage and chat with multiple models. Alpaca provides an easy and beginner-friendly way of interacting with local AI; everything is open source and powered by Ollama.
---
> [!WARNING]
> This project is not affiliated at all with Ollama; I'm not responsible for any damages to your device or software caused by running code given by any AI models.
> This project is not affiliated at all with Ollama; I'm not responsible for any damages to your device or software caused by running code given by any models.
> [!IMPORTANT]
> Please be aware, before interacting with this repository, that the [GNOME Code of Conduct](https://conduct.gnome.org) applies to Alpaca.
> [!important]
> This is my first GTK4 / Adwaita / Python app, so it might crash and some features are still under development; please report any errors if you can. Thank you!
## Features!
- Talk to multiple models in the same conversation
- Pull and delete models from the app
- Image recognition
- Document recognition (plain text files)
- Code highlighting
## Future features!
- Multiple conversations
- Image / document recognition
- Notifications
- Import / Export chats
- Delete / Edit messages
- Regenerate messages
- YouTube recognition (Ask questions about a YouTube video using the transcript)
- Website recognition (Ask questions about a certain website by parsing the url)
- Code highlighting
## Screenies
Login to Ollama instance | Chatting with models | Managing models
:-------------------------:|:-------------------------:|:-------------------------:
![Screenshot from 2024-05-12 19-58-28](https://jeffser.com/images/alpaca/screenie1.png) | ![Screenshot from 2024-05-12 20-01-08](https://jeffser.com/images/alpaca/screenie2.png) | ![Screenshot from 2024-05-12 20-01-31](https://jeffser.com/images/alpaca/screenie3.png)
Normal conversation | Image recognition | Code highlighting | YouTube transcription | Model management
:------------------:|:-----------------:|:-----------------:|:---------------------:|:----------------:
![screenie1](https://jeffser.com/images/alpaca/screenie1.png) | ![screenie2](https://jeffser.com/images/alpaca/screenie2.png) | ![screenie3](https://jeffser.com/images/alpaca/screenie3.png) | ![screenie4](https://jeffser.com/images/alpaca/screenie5.png) | ![screenie5](https://jeffser.com/images/alpaca/screenie6.png)
## Preview
1. Clone repo using Gnome Builder
2. Press the `run` button
## Installation
1. Go to the `releases` page
2. Download the latest flatpak package
3. Open it
### Flathub
## Usage
- You'll need an Ollama instance; I recommend using the [Docker image](https://ollama.com/blog/ollama-is-now-available-as-an-official-docker-image) (see the example command after this list).
- Once you open Alpaca it will ask you for a URL. If you are using the same computer as the Ollama instance and didn't change the ports, you can use the default URL.
- You might need a model; you can get one using the box icon at the top of the app. I recommend phi3 because it is very lightweight, but you can use whatever you want (I haven't actually tested them all, so your mileage may vary).
- Then just start talking! You can mix different models; they all share the same conversation, which is really cool in my opinion.
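For reference, a minimal sketch of starting an Ollama instance with Docker, assuming Docker is installed (this mirrors the command from the Docker announcement linked above; adjust the volume and port mapping to taste):
```BASH
# Start a CPU-only Ollama container; the API listens on port 11434 by default
docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
```
With that running on the same machine, the URL to give Alpaca is typically `http://localhost:11434`.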
You can find the latest stable version of the app on [Flathub](https://flathub.org/apps/com.jeffser.Alpaca)
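For command-line users, a minimal sketch of the Flathub install, assuming Flatpak and the Flathub remote are already configured:
```BASH
flatpak install flathub com.jeffser.Alpaca
flatpak run com.jeffser.Alpaca
```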
### Flatpak Package
Every time a new version is published, it becomes available on the [releases page](https://github.com/Jeffser/Alpaca/releases) of the repository.
### Snap Package
You can also find the Snap package on the [releases page](https://github.com/Jeffser/Alpaca/releases); to install it, run this command:
```BASH
sudo snap install ./{package name} --dangerous
```
The `--dangerous` flag is needed because the package is installed without any involvement of the Snap Store; I'm working on getting the app there, but for now you can test the app this way.
### Building Git Version
Note: This is not recommended since the prerelease versions of the app often present errors and general instability.
1. Clone the project
2. Open with Gnome Builder
3. Press the run button (or export if you want to build a Flatpak package; see the command-line sketch below)
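If you prefer the command line over GNOME Builder, here is a rough sketch using `flatpak-builder` with the manifest in the repository; it assumes `flatpak-builder` and the GNOME runtime/SDK required by the manifest are installed, and `builddir` is just an arbitrary build directory name:
```BASH
git clone https://github.com/Jeffser/Alpaca.git
cd Alpaca
# Build from the Flatpak manifest and install for the current user
flatpak-builder --user --install --force-clean builddir com.jeffser.Alpaca.json
flatpak run com.jeffser.Alpaca
```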
## Translators
Language | Contributors
:----------------------|:-----------
🇷🇺 Russian | [Alex K](https://github.com/alexkdeveloper)
🇪🇸 Spanish | [Jeffry Samuel](https://github.com/jeffser)
🇫🇷 French | [Louis Chauvet-Villaret](https://github.com/loulou64490) , [Théo FORTIN](https://github.com/topiga)
🇧🇷 Brazilian Portuguese | [Daimar Stein](https://github.com/not-a-dev-stein) , [Bruno Antunes](https://github.com/antun3s)
🇳🇴 Norwegian | [CounterFlow64](https://github.com/CounterFlow64)
🇮🇳 Bengali | [Aritra Saha](https://github.com/olumolu)
🇨🇳 Simplified Chinese | [Yuehao Sui](https://github.com/8ar10der) , [Aleksana](https://github.com/Aleksanaa)
🇮🇳 Hindi | [Aritra Saha](https://github.com/olumolu)
🇹🇷 Turkish | [YusaBecerikli](https://github.com/YusaBecerikli)
🇺🇦 Ukrainian | [Simon](https://github.com/OriginalSimon)
🇩🇪 German | [Marcel Margenberg](https://github.com/MehrzweckMandala)
🇮🇱 Hebrew | [Yosef Or Boczko](https://github.com/yoseforb)
🇮🇳 Telugu | [Aryan Karamtoth](https://github.com/SpaciousCoder78)
Want to add a language? Visit [this discussion](https://github.com/Jeffser/Alpaca/discussions/153) to get started!
---
## Thanks
- [not-a-dev-stein](https://github.com/not-a-dev-stein) for their help with requesting a new icon and bug reports
- [TylerLaBree](https://github.com/TylerLaBree) for their requests and ideas
- [Imbev](https://github.com/imbev) for their reports and suggestions
- [Nokse](https://github.com/Nokse22) for their contributions to the UI and table rendering
- [Louis Chauvet-Villaret](https://github.com/loulou64490) for their suggestions
- [Aleksana](https://github.com/Aleksanaa) for her help with better handling of directories
- [Gnome Builder Team](https://gitlab.gnome.org/GNOME/gnome-builder) for the awesome IDE I use to develop Alpaca
- Sponsors for giving me enough money to be able to take a ride to my campus every time I need to <3
- Everyone that has shared kind words of encouragement!
---
## Dependencies
- [Requests](https://github.com/psf/requests)
- [Pillow](https://github.com/python-pillow/Pillow)
- [Pypdf](https://github.com/py-pdf/pypdf)
- [Pytube](https://github.com/pytube/pytube)
- [Html2Text](https://github.com/aaronsw/html2text)
- [Ollama](https://github.com/ollama/ollama)
- [Numactl](https://github.com/numactl/numactl)
## About forks
If you want to fork this... I mean, I think it would be better if you start from scratch, my code isn't well documented at all, but if you really want to, please give me some credit, that's all I ask for... And maybe a donation (joke)


@@ -1,12 +0,0 @@
# Security Policy
## Supported Packaging
Alpaca only supports [Flatpak](https://flatpak.org/) packaging officially; any other packaging method might not behave as expected.
## Official Versions
The only ways Alpaca is being distributed officially are:
- [Alpaca's GitHub Repository Releases Page](https://github.com/Jeffser/Alpaca/releases)
- [Flathub](https://flathub.org/apps/com.jeffser.Alpaca)


@@ -1,28 +1,16 @@
{
    "id" : "com.jeffser.Alpaca",
    "runtime" : "org.gnome.Platform",
    "runtime-version" : "47",
    "runtime-version" : "46",
    "sdk" : "org.gnome.Sdk",
    "command" : "alpaca",
    "finish-args" : [
        "--share=network",
        "--share=ipc",
        "--socket=fallback-x11",
        "--device=all",
        "--socket=wayland",
        "--filesystem=/sys/module/amdgpu:ro",
        "--env=LD_LIBRARY_PATH=/app/lib:/usr/lib/x86_64-linux-gnu/GL/default/lib:/usr/lib/x86_64-linux-gnu/openh264/extra:/usr/lib/x86_64-linux-gnu/openh264/extra:/usr/lib/sdk/llvm15/lib:/usr/lib/x86_64-linux-gnu/GL/default/lib:/usr/lib/ollama:/app/plugins/AMD/lib/ollama",
        "--env=GSK_RENDERER=ngl"
        "--device=dri",
        "--socket=wayland"
    ],
    "add-extensions": {
        "com.jeffser.Alpaca.Plugins": {
            "add-ld-path": "/app/plugins/AMD/lib/ollama",
            "directory": "plugins",
            "no-autodownload": true,
            "autodelete": true,
            "subdirectories": true
        }
    },
    "cleanup" : [
        "/include",
        "/lib/pkgconfig",
@@ -83,141 +71,6 @@
}
]
},
{
"name": "python3-pypdf",
"buildsystem": "simple",
"build-commands": [
"pip3 install --verbose --exists-action=i --no-index --find-links=\"file://${PWD}\" --prefix=${FLATPAK_DEST} \"pypdf\" --no-build-isolation"
],
"sources": [
{
"type": "file",
"url": "https://files.pythonhosted.org/packages/c9/d1/450b19bbdbb2c802f554312c62ce2a2c0d8744fe14735bc70ad2803578c7/pypdf-4.2.0-py3-none-any.whl",
"sha256": "dc035581664e0ad717e3492acebc1a5fc23dba759e788e3d4a9fc9b1a32e72c1"
}
]
},
{
"name": "python3-pytube",
"buildsystem": "simple",
"build-commands": [
"pip3 install --verbose --exists-action=i --no-index --find-links=\"file://${PWD}\" --prefix=${FLATPAK_DEST} \"pytube\" --no-build-isolation"
],
"sources": [
{
"type": "file",
"url": "https://files.pythonhosted.org/packages/51/64/bcf8632ed2b7a36bbf84a0544885ffa1d0b4bcf25cc0903dba66ec5fdad9/pytube-15.0.0-py3-none-any.whl",
"sha256": "07b9904749e213485780d7eb606e5e5b8e4341aa4dccf699160876da00e12d78"
}
]
},
{
"name": "python3-youtube-transcript-api",
"buildsystem": "simple",
"build-commands": [
"pip3 install --verbose --exists-action=i --no-index --find-links=\"file://${PWD}\" --prefix=${FLATPAK_DEST} \"youtube-transcript-api\" --no-build-isolation"
],
"sources": [
{
"type": "file",
"url": "https://files.pythonhosted.org/packages/12/90/3c9ff0512038035f59d279fddeb79f5f1eccd8859f06d6163c58798b9487/certifi-2024.8.30-py3-none-any.whl",
"sha256": "922820b53db7a7257ffbda3f597266d435245903d80737e34f8a45ff3e3230d8"
},
{
"type": "file",
"url": "https://files.pythonhosted.org/packages/f2/4f/e1808dc01273379acc506d18f1504eb2d299bd4131743b9fc54d7be4df1e/charset_normalizer-3.4.0.tar.gz",
"sha256": "223217c3d4f82c3ac5e29032b3f1c2eb0fb591b72161f86d93f5719079dae93e"
},
{
"type": "file",
"url": "https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl",
"sha256": "946d195a0d259cbba61165e88e65941f16e9b36ea6ddb97f00452bae8b1287d3"
},
{
"type": "file",
"url": "https://files.pythonhosted.org/packages/f9/9b/335f9764261e915ed497fcdeb11df5dfd6f7bf257d4a6a2a686d80da4d54/requests-2.32.3-py3-none-any.whl",
"sha256": "70761cfe03c773ceb22aa2f671b4757976145175cdfca038c02654d061d6dcc6"
},
{
"type": "file",
"url": "https://files.pythonhosted.org/packages/ce/d9/5f4c13cecde62396b0d3fe530a50ccea91e7dfc1ccf0e09c228841bb5ba8/urllib3-2.2.3-py3-none-any.whl",
"sha256": "ca899ca043dcb1bafa3e262d73aa25c465bfb49e0bd9dd5d59f1d0acba2f8fac"
},
{
"type": "file",
"url": "https://files.pythonhosted.org/packages/52/42/5f57d37d56bdb09722f226ed81cc1bec63942da745aa27266b16b0e16a5d/youtube_transcript_api-0.6.2-py3-none-any.whl",
"sha256": "019dbf265c6a68a0591c513fff25ed5a116ce6525832aefdfb34d4df5567121c"
}
]
},
{
"name": "python3-html2text",
"buildsystem": "simple",
"build-commands": [
"pip3 install --verbose --exists-action=i --no-index --find-links=\"file://${PWD}\" --prefix=${FLATPAK_DEST} \"html2text\" --no-build-isolation"
],
"sources": [
{
"type": "file",
"url": "https://files.pythonhosted.org/packages/1a/43/e1d53588561e533212117750ee79ad0ba02a41f52a08c1df3396bd466c05/html2text-2024.2.26.tar.gz",
"sha256": "05f8e367d15aaabc96415376776cdd11afd5127a77fce6e36afc60c563ca2c32"
}
]
},
{
"name": "ollama",
"buildsystem": "simple",
"build-commands": [
"cp -r --remove-destination * ${FLATPAK_DEST}/",
"mkdir ${FLATPAK_DEST}/plugins"
],
"sources": [
{
"type": "archive",
"url": "https://github.com/ollama/ollama/releases/download/v0.3.12/ollama-linux-amd64.tgz",
"sha256": "f0efa42f7ad77cd156bd48c40cd22109473801e5113173b0ad04f094a4ef522b",
"only-arches": [
"x86_64"
]
},
{
"type": "archive",
"url": "https://github.com/ollama/ollama/releases/download/v0.3.12/ollama-linux-arm64.tgz",
"sha256": "da631cbe4dd2c168dae58d6868b1ff60e881e050f2d07578f2f736e689fec04c",
"only-arches": [
"aarch64"
]
}
]
},
{
"name": "libnuma",
"buildsystem": "autotools",
"build-commands": [
"autoreconf -i",
"make",
"make install"
],
"sources": [
{
"type": "archive",
"url": "https://github.com/numactl/numactl/releases/download/v2.0.18/numactl-2.0.18.tar.gz",
"sha256": "b4fc0956317680579992d7815bc43d0538960dc73aa1dd8ca7e3806e30bc1274"
}
]
},
{
"name": "vte",
"buildsystem": "meson",
"config-opts": ["-Dvapi=false"],
"sources": [
{
"type": "archive",
"url": "https://gitlab.gnome.org/GNOME/vte/-/archive/0.78.0/vte-0.78.0.tar.gz",
"sha256": "82e19d11780fed4b66400f000829ce5ca113efbbfb7975815f26ed93e4c05f2d"
}
]
},
{
"name" : "alpaca",
"builddir" : true,
@@ -225,8 +78,7 @@
"sources" : [
{
"type" : "git",
"url": "https://github.com/Jeffser/Alpaca.git",
"branch" : "main"
"url" : "file:///home/tentri/Documents/Alpaca"
}
]
}


@@ -5,6 +5,4 @@ Icon=com.jeffser.Alpaca
Terminal=false
Type=Application
Categories=Utility;Development;Chat;
Keywords=ai;ollama;llm
StartupNotify=true
X-Purism-FormFactor=Workstation;Mobile;


@@ -5,21 +5,14 @@
<project_license>GPL-3.0-or-later</project_license>
<launchable type="desktop-id">com.jeffser.Alpaca.desktop</launchable>
<name>Alpaca</name>
<summary>Chat with local AI models</summary>
<summary>An Ollama client</summary>
<description>
<p>Chat with multiple AI models</p>
<p>An Ollama client</p>
<p>Features</p>
<ul>
<li>Built in Ollama instance</li>
<li>Talk to multiple models in the same conversation</li>
<li>Pull and delete models from the app</li>
<li>Have multiple conversations</li>
<li>Image recognition (Only available with compatible models)</li>
<li>Plain text documents recognition</li>
<li>Import and export chats</li>
<li>Append YouTube transcripts to the prompt</li>
<li>Append text from a website to the prompt</li>
<li>PDF recognition</li>
</ul>
<p>Disclaimer</p>
<p>This project is not affiliated at all with Ollama; I'm not responsible for any damages to your device or software caused by running code given by any models.</p>
@@ -36,14 +29,6 @@
<category>Development</category>
<category>Chat</category>
</categories>
<requires>
<display_length compare="ge">360</display_length>
</requires>
<recommends>
<control>keyboard</control>
<control>pointing</control>
<control>touch</control>
</recommends>
<branding>
<color type="primary" scheme_preference="light">#8cdef5</color>
<color type="primary" scheme_preference="dark">#0f2b78</color>
@@ -51,689 +36,22 @@
<screenshots>
<screenshot type="default">
<image>https://jeffser.com/images/alpaca/screenie1.png</image>
<caption>A normal conversation with an AI Model</caption>
<caption>Welcome dialog</caption>
</screenshot>
<screenshot>
<image>https://jeffser.com/images/alpaca/screenie2.png</image>
<caption>A conversation involving image recognition</caption>
<caption>A conversation involving multiple models</caption>
</screenshot>
<screenshot>
<image>https://jeffser.com/images/alpaca/screenie3.png</image>
<caption>A conversation showing code highlighting</caption>
</screenshot>
<screenshot>
<image>https://jeffser.com/images/alpaca/screenie4.png</image>
<caption>A Python script running inside integrated terminal</caption>
</screenshot>
<screenshot>
<image>https://jeffser.com/images/alpaca/screenie5.png</image>
<caption>A conversation involving a YouTube video transcript</caption>
</screenshot>
<screenshot>
<image>https://jeffser.com/images/alpaca/screenie6.png</image>
<caption>Multiple models being downloaded</caption>
<caption>Managing models</caption>
</screenshot>
</screenshots>
<content_rating type="oars-1.1" />
<url type="bugtracker">https://github.com/Jeffser/Alpaca/issues</url>
<url type="homepage">https://jeffser.com/alpaca/</url>
<url type="homepage">https://github.com/Jeffser/Alpaca</url>
<url type="donation">https://github.com/sponsors/Jeffser</url>
<url type="translate">https://github.com/Jeffser/Alpaca/discussions/153</url>
<url type="contribute">https://github.com/Jeffser/Alpaca/discussions/154</url>
<url type="vcs-browser">https://github.com/Jeffser/Alpaca</url>
<releases>
<release version="2.7.0" date="2024-10-15">
<url type="details">https://github.com/Jeffser/Alpaca/releases/tag/2.7.0</url>
<description>
<p>New</p>
<ul>
<li>User messages are now compacted into bubbles</li>
</ul>
<p>Fixes</p>
<ul>
<li>Fixed re connection dialog not working when 'use local instance' is selected</li>
<li>Fixed model manager not adapting to large system fonts</li>
</ul>
</description>
</release>
<release version="2.6.5" date="2024-10-13">
<url type="details">https://github.com/Jeffser/Alpaca/releases/tag/2.6.5</url>
<description>
<p>New</p>
<ul>
<li>Details page for models</li>
<li>Model selector gets replaced with 'manage models' button when there are no models downloaded</li>
<li>Added warning when model is too big for the device</li>
<li>Added AMD GPU indicator in preferences</li>
</ul>
</description>
</release>
<release version="2.6.0" date="2024-10-11">
<url type="details">https://github.com/Jeffser/Alpaca/releases/tag/2.6.0</url>
<description>
<p>New</p>
<ul>
<li>Better system for handling dialogs</li>
<li>Better system for handling instance switching</li>
<li>Remote connection dialog</li>
</ul>
<p>Fixes</p>
<ul>
<li>Fixed: Models get duplicated when switching remote and local instance</li>
<li>Better internal instance manager</li>
</ul>
</description>
</release>
<release version="2.5.1" date="2024-10-09">
<url type="details">https://github.com/Jeffser/Alpaca/releases/tag/2.5.1</url>
<description>
<p>New</p>
<ul>
<li>Added 'Cancel' and 'Save' buttons when editing a message</li>
</ul>
<p>Fixes</p>
<ul>
<li>Better handling of image recognition</li>
<li>Remove unused files when canceling a model download</li>
<li>Better message blocks rendering</li>
</ul>
</description>
</release>
<release version="2.5.0" date="2024-10-06">
<url type="details">https://github.com/Jeffser/Alpaca/releases/tag/2.5.0</url>
<description>
<p>New</p>
<ul>
<li>Run bash and python scripts straight from chat</li>
<li>Updated Ollama to 0.3.12</li>
<li>New models!</li>
</ul>
<p>Fixes</p>
<ul>
<li>Fixed the launch sequence and made it faster</li>
<li>Better detection of code blocks in messages</li>
<li>Fixed app not loading in certain setups with Nvidia GPUs</li>
</ul>
</description>
</release>
<release version="2.0.6" date="2024-09-29">
<url type="details">https://github.com/Jeffser/Alpaca/releases/tag/2.0.6</url>
<description>
<p>Fixes</p>
<ul>
<li>Fixed message notification sometimes crashing text rendering because of them running on different threads</li>
</ul>
</description>
</release>
<release version="2.0.5" date="2024-09-25">
<url type="details">https://github.com/Jeffser/Alpaca/releases/tag/2.0.5</url>
<description>
<p>Fixes</p>
<ul>
<li>Fixed message generation sometimes failing</li>
</ul>
</description>
</release>
<release version="2.0.4" date="2024-09-22">
<url type="details">https://github.com/Jeffser/Alpaca/releases/tag/2.0.4</url>
<description>
<p>New</p>
<ul>
<li>Sidebar resizes with the window</li>
<li>New welcome dialog</li>
<li>Message search</li>
<li>Updated Ollama to v0.3.11</li>
<li>A lot of new models provided by Ollama repository</li>
</ul>
<p>Fixes</p>
<ul>
<li>Fixed text inside model manager when the accessibility option 'large text' is on</li>
<li>Fixed image recognition on unsupported models</li>
</ul>
</description>
</release>
<release version="2.0.3" date="2024-09-18">
<url type="details">https://github.com/Jeffser/Alpaca/releases/tag/2.0.3</url>
<description>
<p>Fixes</p>
<ul>
<li>Fixed spinner not hiding if the back end fails</li>
<li>Fixed image recognition with local images</li>
<li>Changed appearance of delete / stop model buttons</li>
<li>Fixed stop button crashing the app</li>
</ul>
<p>New</p>
<ul>
<li>Made sidebar resize a little when the window is smaller</li>
<li>Instant launch</li>
</ul>
</description>
</release>
<release version="2.0.2" date="2024-09-11">
<url type="details">https://github.com/Jeffser/Alpaca/releases/tag/2.0.2</url>
<description>
<p>Fixes</p>
<ul>
<li>Fixed error on first run (welcome dialog)</li>
<li>Fixed checker for Ollama instance (used on system packages)</li>
</ul>
</description>
</release>
<release version="2.0.1" date="2024-09-11">
<url type="details">https://github.com/Jeffser/Alpaca/releases/tag/2.0.1</url>
<description>
<p>Fixes</p>
<ul>
<li>Fixed 'clear chat' option</li>
<li>Fixed welcome dialog causing the local instance to not launch</li>
<li>Fixed support for AMD GPUs</li>
</ul>
</description>
</release>
<release version="2.0.0" date="2024-09-01">
<url type="details">https://github.com/Jeffser/Alpaca/releases/tag/2.0.0</url>
<description>
<p>New</p>
<ul>
<li>Model, message and chat systems have been rewritten</li>
<li>New models are available</li>
<li>Ollama updated to v0.3.9</li>
<li>Added support for multiple chat generations simultaneously</li>
<li>Added experimental AMD GPU support</li>
<li>Added message loading spinner and new message indicator to chat tab</li>
<li>Added animations</li>
<li>Changed model manager / model selector appearance</li>
<li>Changed message appearance</li>
<li>Added markdown and code blocks to user messages</li>
<li>Added loading dialog at launch so the app opens faster</li>
<li>Added warning when device is on 'battery saver' mode</li>
<li>Added inactivity timer to integrated instance</li>
</ul>
<ul>
<li>The chat is now scrolled to the bottom when it's changed</li>
<li>Better handling of focus on messages</li>
<li>Better general performance on the app</li>
</ul>
</description>
</release>
<release version="1.1.1" date="2024-08-12">
<url type="details">https://github.com/Jeffser/Alpaca/releases/tag/1.1.1</url>
<description>
<p>New</p>
<ul>
<li>New duplicate chat option</li>
<li>Changed model selector appearance</li>
<li>Message entry is focused on launch and chat change</li>
<li>Message is focused when it's being edited</li>
<li>Added loading spinner when regenerating a message</li>
<li>Added Ollama debugging to 'About Alpaca' dialog</li>
<li>Changed YouTube transcription dialog appearance and behavior</li>
</ul>
<p>Fixes</p>
<ul>
<li>CTRL+W and CTRL+Q stops local instance before closing the app</li>
<li>Changed appearance of 'Open Model Manager' button on welcome screen</li>
<li>Fixed message generation not working consistently</li>
<li>Fixed message editing not working consistently</li>
</ul>
</description>
</release>
<release version="1.1.0" date="2024-08-10">
<url type="details">https://github.com/Jeffser/Alpaca/releases/tag/1.1.0</url>
<description>
<p>New</p>
<ul>
<li>Model manager opens faster</li>
<li>Delete chat option in secondary menu</li>
<li>New model selector popup</li>
<li>Standard shortcuts</li>
<li>Model manager is navigable with keyboard</li>
<li>Changed sidebar collapsing behavior</li>
<li>Focus indicators on messages</li>
<li>Welcome screen</li>
<li>Give message entry focus at launch</li>
<li>Generally better code</li>
</ul>
<p>Fixes</p>
<ul>
<li>Better width for dialogs</li>
<li>Better compatibility with screen readers</li>
<li>Fixed message regenerator</li>
<li>Removed 'Featured models' from welcome dialog</li>
<li>Added default buttons to dialogs</li>
<li>Fixed import / export of chats</li>
<li>Changed Python2 title to Python on code blocks</li>
<li>Prevent regeneration of title when the user changed it to a custom title</li>
<li>Show date on stopped messages</li>
<li>Fix clear chat error</li>
</ul>
</description>
</release>
<release version="1.0.6" date="2024-08-04">
<url type="details">https://github.com/Jeffser/Alpaca/releases/tag/1.0.6</url>
<description>
<p>New</p>
<ul>
<li>Changed shortcuts to standards</li>
<li>Moved 'Manage Models' button to primary menu</li>
<li>Stable support for GGUF model files</li>
<li>General optimizations</li>
</ul>
<p>Fixes</p>
<ul>
<li>Better handling of enter key (important for Japanese input)</li>
<li>Removed sponsor dialog</li>
<li>Added sponsor link in about dialog</li>
<li>Changed window and elements dimensions</li>
<li>Selected model changes when entering model manager</li>
<li>Better image tooltips</li>
<li>GGUF Support</li>
</ul>
</description>
</release>
<release version="1.0.5" date="2024-08-02">
<url type="details">https://github.com/Jeffser/Alpaca/releases/tag/1.0.5</url>
<description>
<p>New</p>
<ul>
<li>Regenerate any response, even if they are incomplete</li>
<li>Support for pulling models by name:tag</li>
<li>Stable support for GGUF model files</li>
<li>Restored sidebar toggle button</li>
</ul>
<p>Fixes</p>
<ul>
<li>Reverted back to standard styles</li>
<li>Fixed generated titles having "'S" for some reason</li>
<li>Changed min width for model dropdown</li>
<li>Changed message entry shadow</li>
<li>The last model used is now restored when the user changes chat</li>
<li>Better check for message finishing</li>
</ul>
</description>
</release>
<release version="1.0.4" date="2024-08-01">
<url type="details">https://github.com/Jeffser/Alpaca/releases/tag/1.0.4</url>
<description>
<p>New</p>
<ul>
<li>Added table rendering (Thanks Nokse)</li>
</ul>
<p>Fixes</p>
<ul>
<li>Made support dialog more common</li>
<li>Dialog title on tag chooser when downloading models didn't display properly</li>
<li>Prevent chat generation from generating a title with multiple lines</li>
</ul>
</description>
</release>
<release version="1.0.3" date="2024-08-01">
<url type="details">https://github.com/Jeffser/Alpaca/releases/tag/1.0.3</url>
<description>
<p>New</p>
<ul>
<li>Bearer Token entry on connection error dialog</li>
<li>Small appearance changes</li>
<li>Compatibility with code blocks without explicit language</li>
<li>Rare, optional and dismissible support dialog</li>
</ul>
<p>Fixes</p>
<ul>
<li>Date format for Simplified Chinese translation</li>
<li>Bug with unsupported localizations</li>
<li>Min height being too large to be used on mobile</li>
<li>Remote connection checker bug</li>
</ul>
</description>
</release>
<release version="1.0.2" date="2024-07-29">
<url type="details">https://github.com/Jeffser/Alpaca/releases/tag/1.0.2</url>
<description>
<p>Fixes</p>
<ul>
<li>Models with capital letters on their tag don't work</li>
<li>Ollama fails to launch on some systems</li>
<li>YouTube transcripts are not being saved in the right TMP directory</li>
</ul>
<p>New</p>
<ul>
<li>Debug messages are now shown on the 'About Alpaca' dialog</li>
<li>Updated Ollama to v0.3.0 (new models)</li>
</ul>
</description>
</release>
<release version="1.0.1" date="2024-07-23">
<url type="details">https://github.com/Jeffser/Alpaca/releases/tag/1.0.1</url>
<description>
<p>Fixes</p>
<ul>
<li>Models with '-' in their names didn't work properly, this is now fixed</li>
<li>Better connection check for Ollama</li>
</ul>
</description>
</release>
<release version="1.0.0" date="2024-07-22">
<url type="details">https://github.com/Jeffser/Alpaca/releases/tag/1.0.0</url>
<description>
<p>Stable Release</p>
<p>The new icon was made by Tobias Bernard over the Gnome Gitlab, thanks for the great icon!</p>
<p>Features and fixes</p>
<ul>
<li>Updated Ollama instance to 0.2.8</li>
<li>Better model selector</li>
<li>Model manager redesign</li>
<li>Better tag selector when pulling a model</li>
<li>Model search</li>
<li>Added support for bearer tokens on remote instances</li>
<li>Preferences dialog redesign</li>
<li>Added context menus to interact with a chat</li>
<li>Redesigned primary and secondary menus</li>
<li>YouTube integration: Paste the URL of a video with a transcript and it will be added to the prompt</li>
<li>Website integration (Experimental): Extract the text from the body of a website by adding its URL to the prompt</li>
<li>Chat title generation</li>
<li>Auto resizing of message entry</li>
<li>Chat notifications</li>
<li>Added indicator when an image is missing</li>
<li>Auto rearrange the order of chats when a message is received</li>
<li>Redesigned file preview dialog</li>
<li>Credited new contributors</li>
<li>Better stability and optimization</li>
<li>Edit messages to change the context of a conversation</li>
<li>Added disclaimers when pulling models</li>
<li>Preview files before sending a message</li>
<li>Better format for date and time on messages</li>
<li>Error and debug logging on terminal</li>
<li>Auto-hiding sidebar button</li>
<li>Various UI tweaks</li>
</ul>
<p>New Models</p>
<ul>
<li>Gemma2</li>
<li>GLM4</li>
<li>Codegeex4</li>
<li>InternLM2</li>
<li>Llama3-groq-tool-use</li>
<li>Mathstral</li>
<li>Mistral-nemo</li>
<li>Firefunction-v2</li>
<li>Nuextract</li>
</ul>
<p>Translations</p>
<p>These are all the available translations on 1.0.0, thanks to all the contributors!</p>
<ul>
<li>Russian: Alex K</li>
<li>Spanish: Jeffser</li>
<li>Brazilian Portuguese: Daimar Stein</li>
<li>French: Louis Chauvet-Villaret</li>
<li>Norwegian: CounterFlow64</li>
<li>Bengali: Aritra Saha</li>
<li>Simplified Chinese: Yuehao Sui</li>
</ul>
</description>
</release>
<release version="0.9.6.1" date="2024-06-22">
<url type="details">https://github.com/Jeffser/Alpaca/releases/tag/0.9.6.1</url>
<description>
<p>Fix</p>
<p>Removed DOCX compatibility temporarily due to an error with the python-lxml dependency</p>
</description>
</release>
<release version="0.9.6" date="2024-06-21">
<url type="details">https://github.com/Jeffser/Alpaca/releases/tag/0.9.6</url>
<description>
<p>Big Update</p>
<ul>
<li>Added compatibility for PDF</li>
<li>Added compatibility for DOCX</li>
<li>Merged 'file attachment' menu into one button</li>
</ul>
</description>
</release>
<release version="0.9.5" date="2024-06-04">
<url type="details">https://github.com/Jeffser/Alpaca/releases/tag/0.9.5</url>
<description>
<p>Quick Fix</p>
<p>There were some errors when transitioning from the old version of chats to the new version. I apologize if this caused any corruption in your chat history. This should be the only time such a transition is needed.</p>
</description>
</release>
<release version="0.9.4" date="2024-06-04">
<url type="details">https://github.com/Jeffser/Alpaca/releases/tag/0.9.4</url>
<description>
<p>Huge Update</p>
<ul>
<li>Added: Support for plain text files</li>
<li>Added: New backend system for storing messages</li>
<li>Added: Support for changing Ollama's overrides</li>
<li>General Optimization</li>
</ul>
</description>
</release>
<release version="0.9.3" date="2024-06-01">
<url type="details">https://github.com/Jeffser/Alpaca/releases/tag/0.9.3</url>
<description>
<p>Big Update</p>
<ul>
<li>Added: Support for GGUF models (experimental)</li>
<li>Added: Support for customization and creation of models</li>
<li>Fixed: Icons don't appear on non Gnome systems</li>
<li>Update Ollama to v0.1.39</li>
</ul>
</description>
</release>
<release version="0.9.2" date="2024-05-30">
<url type="details">https://github.com/Jeffser/Alpaca/releases/tag/0.9.2</url>
<description>
<p>Fix</p>
<ul>
<li>Fixed: app didn't open if models tweaks wasn't present in the config files</li>
</ul>
</description>
</release>
<release version="0.9.1" date="2024-05-29">
<url type="details">https://github.com/Jeffser/Alpaca/releases/tag/0.9.1</url>
<description>
<p>Big Update</p>
<ul>
<li>Changed multiple icons (paper airplane for the send button)</li>
<li>Combined export / import chat buttons into a menu</li>
<li>Added 'model tweaks' (temperature, seed, keep_alive)</li>
<li>Fixed send / stop button</li>
<li>Fixed app not checking if remote connection works when starting</li>
</ul>
</description>
</release>
<release version="0.9.0" date="2024-05-29">
<url type="details">https://github.com/Jeffser/Alpaca/releases/tag/0.9.0</url>
<description>
<p>Daily Update</p>
<ul>
<li>Added text ellipsis to chat name so it doesn't change the button width</li>
<li>New shortcut for creating a chat (CTRL+N)</li>
<li>New message entry design</li>
<li>Fixed: Can't rename the same chat multiple times</li>
</ul>
</description>
</release>
<release version="0.8.8" date="2024-05-28">
<url type="details">https://github.com/Jeffser/Alpaca/releases/tag/0.8.8</url>
<description>
<p>The fix</p>
<ul>
<li>Fixed: Ollama instance keeps running in the background even when it is disabled</li>
<li>Fixed: Can't pull models on the integrated instance</li>
</ul>
</description>
</release>
<release version="0.8.7" date="2024-05-27">
<url type="details">https://github.com/Jeffser/Alpaca/releases/tag/0.8.7</url>
<description>
<p>Quick tweaks</p>
<ul>
<li>Added progress bar to models that are being pulled</li>
<li>Added size to tags when pulling a model</li>
<li>General optimizations in the background</li>
</ul>
</description>
</release>
<release version="0.8.6" date="2024-05-26">
<url type="details">https://github.com/Jeffser/Alpaca/releases/tag/0.8.6</url>
<description>
<p>Quick fixes</p>
<ul>
<li>Fixed: Scroll when message is received</li>
<li>Fixed: Content doesn't change when creating a new chat</li>
<li>Added 'Featured Models' page on welcome dialog</li>
</ul>
</description>
</release>
<release version="0.8.5" date="2024-05-26">
<url type="details">https://github.com/Jeffser/Alpaca/releases/tag/0.8.5</url>
<description>
<p>Nice Update</p>
<ul>
<li>UI tweaks (Thanks Nokse22)</li>
<li>General optimizations</li>
<li>Metadata fixes</li>
</ul>
</description>
</release>
<release version="0.8.1" date="2024-05-24">
<url type="details">https://github.com/Jeffser/Alpaca/releases/tag/0.8.1</url>
<description>
<p>Quick fix</p>
<ul>
<li>Updated Spanish translation</li>
<li>Added compatibility for PNG</li>
</ul>
</description>
</release>
<release version="0.8.0" date="2024-05-24">
<url type="details">https://github.com/Jeffser/Alpaca/releases/tag/0.8.0</url>
<description>
<p>New Update</p>
<ul>
<li>Updated model list</li>
<li>Added image recognition to more models</li>
<li>Added Brazilian Portuguese translation (Thanks Daimar Stein)</li>
<li>Refined the general UI (Thanks Nokse22)</li>
<li>Added 'delete message' feature</li>
<li>Added metadata so that software distributors know that the app is compatible with mobile</li>
<li>Changed 'send' shortcut to just the return/enter key (to add a new line use shift+return)</li>
</ul>
</description>
</release>
<release version="0.7.1" date="2024-05-23">
<url type="details">https://github.com/Jeffser/Alpaca/releases/tag/0.7.1</url>
<description>
<p>Bug Fixes</p>
<ul>
<li>Fixed: Minor spelling mistake</li>
<li>Added 'mobile' as a supported form factor</li>
<li>Fixed: 'Connection Error' dialog not working properly</li>
<li>Fixed: App might freeze randomly on startup</li>
<li>Changed 'chats' label on sidebar to 'Alpaca'</li>
</ul>
</description>
</release>
<release version="0.7.0" date="2024-05-22">
<url type="details">https://github.com/Jeffser/Alpaca/releases/tag/0.7.0</url>
<description>
<p>Cool Update</p>
<ul>
<li>Better design for chat window</li>
<li>Better design for chat sidebar</li>
<li>Fixed remote connections</li>
<li>Fixed Ollama restarting in loop</li>
<li>Other cool backend stuff</li>
</ul>
</description>
</release>
<release version="0.6.0" date="2024-05-21">
<url type="details">https://github.com/Jeffser/Alpaca/releases/tag/0.6.0</url>
<description>
<p>Huge Update</p>
<ul>
<li>Added Ollama as part of Alpaca, Ollama will run in a sandbox</li>
<li>Added option to connect to remote instances (how it worked before)</li>
<li>Added option to import and export chats</li>
<li>Added option to run Alpaca with Ollama in the background</li>
<li>Added preferences dialog</li>
<li>Changed the welcome dialog</li>
</ul>
<p>
Please report any errors to the issues page, thank you.
</p>
</description>
</release>
<release version="0.5.5" date="2024-05-20">
<url type="details">https://github.com/Jeffser/Alpaca/releases/tag/0.5.5</url>
<description>
<p>Yet Another Daily Update</p>
<ul>
<li>Added better UI for 'Manage Models' dialog</li>
<li>Added better UI for the chat sidebar</li>
<li>Replaced model description with a button to open Ollama's website for the model</li>
<li>Added myself to the credits as the Spanish translator</li>
<li>Using XDG properly to get config folder</li>
<li>Update for translations</li>
</ul>
<p>
Please report any errors to the issues page, thank you.
</p>
</description>
</release>
<release version="0.5.2" date="2024-05-19">
<url type="details">https://github.com/Jeffser/Alpaca/releases/tag/0.5.2</url>
<description>
<p>Quick Fix</p>
<ul>
<li>The last update had some mistakes in the description of the update</li>
</ul>
<p>
Please report any errors to the issues page, thank you.
</p>
</description>
</release>
<release version="0.5.1" date="2024-05-19">
<url type="details">https://github.com/Jeffser/Alpaca/releases/tag/0.5.1</url>
<description>
<p>Another Daily Update</p>
<ul>
<li>Added full Spanish translation</li>
<li>Added support for background pulling of multiple models</li>
<li>Added interrupt button</li>
<li>Added basic shortcuts</li>
<li>Better translation support</li>
<li>User can now leave the chat name empty when creating a new one; a placeholder name will be added</li>
<li>Better scaling for different window sizes</li>
<li>Fixed: Can't close app if first time setup fails</li>
</ul>
<p>
Please report any errors to the issues page, thank you.
</p>
</description>
</release>
<release version="0.5.0" date="2024-05-19">
<url type="details">https://github.com/Jeffser/Alpaca/releases/tag/0.5.0</url>
<description>
<p>Really Big Update</p>
<ul>
<li>Added multiple chats support!</li>
<li>Added Pango Markup support (bold, list, title, subtitle, monospace)</li>
<li>Added autoscroll if the user is at the bottom of the chat</li>
<li>Added support for multiple tags on a single model</li>
<li>Added better model management dialog</li>
<li>Added loading spinner when sending message</li>
<li>Added notifications if app is not active and a model pull finishes</li>
<li>Added new symbolic icon</li>
<li>Added frame to message textview widget</li>
<li>Fixed "code blocks shouldn't be editable"</li>
</ul>
<p>
Please report any errors to the issues page, thank you.
</p>
</description>
</release>
<release version="0.4.0" date="2024-05-17">
<url type="details">https://github.com/Jeffser/Alpaca/releases/tag/0.4.0</url>
<description>


@@ -1,677 +0,0 @@
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!-- Created with Inkscape (http://www.inkscape.org/) -->
<svg
inkscape:export-ydpi="96"
inkscape:export-xdpi="96"
inkscape:export-filename="Template.png"
width="192"
height="152"
id="svg11300"
sodipodi:version="0.32"
inkscape:version="1.3.2 (091e20ef0f, 2023-11-25)"
sodipodi:docname="com.jeffser.Alpaca.Source.svg"
inkscape:output_extension="org.inkscape.output.svg.inkscape"
version="1.0"
style="display:inline;enable-background:new"
viewBox="0 0 192 152"
xml:space="preserve"
xmlns:inkscape="http://www.inkscape.org/namespaces/inkscape"
xmlns:sodipodi="http://sodipodi.sourceforge.net/DTD/sodipodi-0.dtd"
xmlns:xlink="http://www.w3.org/1999/xlink"
xmlns="http://www.w3.org/2000/svg"
xmlns:svg="http://www.w3.org/2000/svg"
xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
xmlns:cc="http://creativecommons.org/ns#"
xmlns:dc="http://purl.org/dc/elements/1.1/"><title
id="title4162">Adwaita Icon Template</title><defs
id="defs3"><linearGradient
id="linearGradient74"
inkscape:collect="always"><stop
style="stop-color:#b6d1f2;stop-opacity:1"
offset="0"
id="stop73" /><stop
style="stop-color:#e9eef4;stop-opacity:1"
offset="1"
id="stop74" /></linearGradient><inkscape:path-effect
effect="fillet_chamfer"
id="path-effect10"
is_visible="true"
lpeversion="1"
nodesatellites_param="F,0,0,1,0,0,0,1 @ F,0,0,1,0,0,0,1 @ F,0,0,1,0,1.0001776,0,1 @ F,0,0,1,0,1.9999911,0,1 @ F,0,1,1,0,0,0,1"
radius="4"
unit="px"
method="auto"
mode="F"
chamfer_steps="1"
flexible="false"
use_knot_distance="true"
apply_no_radius="true"
apply_with_radius="true"
only_selected="false"
hide_knots="false" /><inkscape:path-effect
effect="fillet_chamfer"
id="path-effect203"
is_visible="true"
lpeversion="1"
nodesatellites_param="F,0,0,1,0,0,0,1 @ F,0,0,1,0,0,0,1 @ F,0,0,1,0,1.0001776,0,1 @ F,0,0,1,0,1.9999911,0,1 @ F,0,1,1,0,0,0,1"
radius="4"
unit="px"
method="auto"
mode="F"
chamfer_steps="1"
flexible="false"
use_knot_distance="true"
apply_no_radius="true"
apply_with_radius="true"
only_selected="false"
hide_knots="false" /><linearGradient
inkscape:collect="always"
xlink:href="#linearGradient74"
id="linearGradient243"
gradientUnits="userSpaceOnUse"
x1="48"
y1="260"
x2="48"
y2="220" /></defs><sodipodi:namedview
stroke="#ef2929"
fill="#f57900"
id="base"
pagecolor="#ffffff"
bordercolor="#666666"
borderopacity="0.25490196"
inkscape:pageopacity="0.0"
inkscape:pageshadow="2"
inkscape:zoom="2.3650602"
inkscape:cx="28.5405"
inkscape:cy="77.376467"
inkscape:current-layer="layer9"
showgrid="false"
inkscape:grid-bbox="true"
inkscape:document-units="px"
inkscape:showpageshadow="false"
inkscape:window-width="1920"
inkscape:window-height="1011"
inkscape:window-x="0"
inkscape:window-y="0"
width="400px"
height="300px"
inkscape:snap-nodes="true"
inkscape:snap-bbox="true"
objecttolerance="7"
gridtolerance="12"
guidetolerance="13"
inkscape:window-maximized="1"
inkscape:pagecheckerboard="false"
showguides="false"
inkscape:guide-bbox="true"
inkscape:locked="false"
inkscape:measure-start="0,0"
inkscape:measure-end="0,0"
inkscape:object-nodes="true"
inkscape:bbox-nodes="true"
inkscape:snap-global="true"
inkscape:object-paths="true"
inkscape:snap-intersection-paths="true"
inkscape:snap-bbox-edge-midpoints="true"
inkscape:snap-bbox-midpoints="true"
showborder="true"
inkscape:snap-center="true"
inkscape:snap-object-midpoints="true"
inkscape:snap-midpoints="true"
inkscape:snap-smooth-nodes="true"
inkscape:snap-text-baseline="true"
borderlayer="true"
inkscape:deskcolor="#d1d1d1"><inkscape:grid
type="xygrid"
id="grid5883"
spacingx="2"
spacingy="2"
enabled="true"
visible="false"
empspacing="4"
originx="8"
originy="8"
units="px" /><sodipodi:guide
position="72,16"
orientation="0,1"
id="guide1073"
inkscape:locked="false"
inkscape:label=""
inkscape:color="rgb(0,0,255)" /><sodipodi:guide
position="20,72"
orientation="1,0"
id="guide1075"
inkscape:locked="false"
inkscape:label=""
inkscape:color="rgb(0,0,255)" /><sodipodi:guide
position="72,112"
orientation="0,1"
id="guide1099"
inkscape:locked="false"
inkscape:label=""
inkscape:color="rgb(0,0,255)" /><sodipodi:guide
position="72,136"
orientation="0,1"
id="guide993"
inkscape:locked="false"
inkscape:label=""
inkscape:color="rgb(0,0,255)" /><sodipodi:guide
position="112,72"
orientation="1,0"
id="guide995"
inkscape:locked="false"
inkscape:label=""
inkscape:color="rgb(0,0,255)" /><sodipodi:guide
position="8.0000001,72"
orientation="1,0"
id="guide867"
inkscape:locked="false"
inkscape:label=""
inkscape:color="rgb(0,0,255)" /><sodipodi:guide
position="128,72"
orientation="1,0"
id="guide869"
inkscape:locked="false"
inkscape:label=""
inkscape:color="rgb(0,0,255)" /><sodipodi:guide
position="72,124"
orientation="0,1"
id="guide871"
inkscape:locked="false"
inkscape:label=""
inkscape:color="rgb(0,0,255)" /><inkscape:grid
type="xygrid"
id="grid873"
spacingx="1"
spacingy="1"
empspacing="8"
color="#000000"
opacity="0.49019608"
empcolor="#000000"
empopacity="0.08627451"
dotted="true"
originx="8"
originy="8"
units="px"
visible="false" /><sodipodi:guide
position="32,72"
orientation="1,0"
id="guide877"
inkscape:locked="false"
inkscape:label=""
inkscape:color="rgb(0,0,255)" /><sodipodi:guide
position="124,72"
orientation="1,0"
id="guide879"
inkscape:locked="false"
inkscape:label=""
inkscape:color="rgb(0,0,255)" /><sodipodi:guide
position="72,128"
orientation="0,1"
id="guide881"
inkscape:locked="false"
inkscape:label=""
inkscape:color="rgb(0,0,255)" /><sodipodi:guide
position="72,20"
orientation="0,1"
id="guide883"
inkscape:locked="false"
inkscape:label=""
inkscape:color="rgb(0,0,255)" /><sodipodi:guide
position="16,72"
orientation="1,0"
id="guide885"
inkscape:locked="false"
inkscape:label=""
inkscape:color="rgb(0,0,255)" /><sodipodi:guide
position="136,72"
orientation="1,0"
id="guide887"
inkscape:locked="false"
inkscape:label=""
inkscape:color="rgb(0,0,255)" /><sodipodi:guide
position="72,8"
orientation="0,1"
id="guide897"
inkscape:locked="false"
inkscape:label=""
inkscape:color="rgb(0,0,255)" /><sodipodi:guide
position="72,32"
orientation="0,1"
id="guide899"
inkscape:locked="false"
inkscape:label=""
inkscape:color="rgb(0,0,255)" /><sodipodi:guide
position="264,264"
orientation="-0.70710678,0.70710678"
id="guide950"
inkscape:locked="false"
inkscape:label=""
inkscape:color="rgb(0,0,255)" /><sodipodi:guide
position="72,72"
orientation="0.70710678,0.70710678"
id="guide952"
inkscape:locked="false"
inkscape:label=""
inkscape:color="rgb(0,0,255)" /></sodipodi:namedview><metadata
id="metadata4"><rdf:RDF><cc:Work
rdf:about=""><dc:format>image/svg+xml</dc:format><dc:type
rdf:resource="http://purl.org/dc/dcmitype/StillImage" /><dc:creator><cc:Agent><dc:title>GNOME Design Team</dc:title></cc:Agent></dc:creator><dc:source /><cc:license
rdf:resource="http://creativecommons.org/licenses/by-sa/4.0/" /><dc:title>Adwaita Icon Template</dc:title><dc:subject><rdf:Bag /></dc:subject><dc:date /><dc:rights><cc:Agent><dc:title /></cc:Agent></dc:rights><dc:publisher><cc:Agent><dc:title /></cc:Agent></dc:publisher><dc:identifier /><dc:relation /><dc:language /><dc:coverage /><dc:description /><dc:contributor><cc:Agent><dc:title /></cc:Agent></dc:contributor></cc:Work><cc:License
rdf:about="http://creativecommons.org/licenses/by-sa/4.0/"><cc:permits
rdf:resource="http://creativecommons.org/ns#Reproduction" /><cc:permits
rdf:resource="http://creativecommons.org/ns#Distribution" /><cc:requires
rdf:resource="http://creativecommons.org/ns#Notice" /><cc:requires
rdf:resource="http://creativecommons.org/ns#Attribution" /><cc:permits
rdf:resource="http://creativecommons.org/ns#DerivativeWorks" /><cc:requires
rdf:resource="http://creativecommons.org/ns#ShareAlike" /></cc:License></rdf:RDF></metadata><g
id="layer1"
inkscape:label="App Icon"
inkscape:groupmode="layer"
style="display:inline"
transform="translate(8,-156)"><g
inkscape:groupmode="layer"
id="layer4"
inkscape:label="template"
style="display:inline"
sodipodi:insensitive="true"><rect
inkscape:label="0"
y="172"
x="9.2651362e-08"
height="128"
width="128"
id="hicolor"
style="display:inline;overflow:visible;visibility:visible;fill:#f0f0f0;fill-opacity:0;fill-rule:nonzero;stroke:none;stroke-width:0.5;marker:none;enable-background:accumulate" /><rect
style="display:inline;overflow:visible;visibility:visible;fill:#f0f0f0;fill-opacity:0;fill-rule:nonzero;stroke:none;stroke-width:0.5;marker:none;enable-background:accumulate"
id="symbolic"
width="16"
height="16"
x="160"
y="172"
inkscape:label="0" /></g><g
inkscape:groupmode="layer"
id="layer2"
inkscape:label="baseplate"
style="display:none"
sodipodi:insensitive="true"><g
style="display:inline;fill:#000000;enable-background:new"
transform="matrix(7.9911709,0,0,8.0036407,-167.7909,-4846.0776)"
id="g12027"
inkscape:export-xdpi="12"
inkscape:export-ydpi="12" /><rect
style="display:inline;overflow:visible;visibility:visible;fill:#f0f0f0;fill-opacity:1;fill-rule:nonzero;stroke:none;stroke-width:0.5;marker:none;enable-background:accumulate"
id="128"
width="128"
height="128"
x="9.2651362e-08"
y="172"
inkscape:label="0" /><g
id="g883"
style="fill:none;fill-opacity:0.25098;stroke:#a579b3;stroke-opacity:1"
transform="translate(-24,24)" /><g
id="g900"
style="fill:none;fill-opacity:0.25098;stroke:#a579b3;stroke-opacity:1"
transform="translate(-24,24)" /><rect
inkscape:label=""
y="172"
x="160"
height="16"
width="16"
id="16"
style="display:inline;overflow:visible;visibility:visible;fill:#f0f0f0;fill-opacity:1;fill-rule:nonzero;stroke:none;stroke-width:0.5;marker:none;enable-background:accumulate" /><text
xml:space="preserve"
style="font-style:normal;font-variant:normal;font-weight:bold;font-stretch:normal;font-size:4px;line-height:125%;font-family:Cantarell;-inkscape-font-specification:'Cantarell, Bold';text-align:start;writing-mode:lr-tb;text-anchor:start;display:inline;fill:#000000;fill-opacity:1;stroke:none;stroke-width:0.332649;enable-background:new"
x="0"
y="167"
id="text863"
inkscape:label="icon-name"><tspan
style="font-size:4px;stroke-width:0.332649"
sodipodi:role="line"
id="tspan861"
x="0"
y="167">Hicolor</tspan></text><text
inkscape:label="icon-name"
id="text867"
y="167"
x="160"
style="font-style:normal;font-variant:normal;font-weight:bold;font-stretch:normal;font-size:4px;line-height:125%;font-family:Cantarell;-inkscape-font-specification:'Cantarell, Bold';text-align:start;writing-mode:lr-tb;text-anchor:start;display:inline;fill:#000000;fill-opacity:1;stroke:none;stroke-width:0.332649;enable-background:new"
xml:space="preserve"><tspan
y="167"
x="160"
id="tspan865"
sodipodi:role="line"
style="font-size:4px;stroke-width:0.332649">Symbolic</tspan></text></g><g
inkscape:groupmode="layer"
id="layer9"
inkscape:label="icons"
style="display:inline"><path
sodipodi:type="star"
style="display:none;fill:none;fill-opacity:1;stroke:#000000;stroke-width:0.377953;stroke-linecap:round;stroke-linejoin:round;stroke-dasharray:none;stroke-opacity:1;paint-order:fill markers stroke"
id="path7608"
inkscape:flatsided="true"
sodipodi:sides="6"
sodipodi:cx="88"
sodipodi:cy="80"
sodipodi:r1="24"
sodipodi:r2="22.173109"
sodipodi:arg1="-1.5707963"
sodipodi:arg2="-1.0471975"
inkscape:rounded="-3.469447e-18"
inkscape:randomized="0"
d="m 88.000001,56 20.784609,12.000001 0,24 L 87.999999,104 67.21539,91.999999 l 10e-7,-24 z"
transform="matrix(1.0672586,0,0,1.0932338,-15.01003,148.2195)" /><g
id="g157"
transform="matrix(0.37500254,0,0,0.37500254,131.62469,107.4995)"
style="display:inline;fill:#241f31;enable-background:new" /><g
id="g158"
transform="translate(-70)" /><g
id="g162"
style="display:inline;fill:#241f31;fill-opacity:1;stroke:#241f31;stroke-dasharray:none;stroke-dashoffset:0;stroke-opacity:1;enable-background:new"
transform="matrix(0.37500254,0,0,0.37500254,121.49966,106.7495)" /><path
style="color:#000000;fill:#241f31;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:1.1;-inkscape-stroke:none"
d="m 172.5,175 a 1,1 0 0 0 -1,1 1,1 0 0 0 1,1 c 0.28799,0 0.5,0.21201 0.5,0.5 a 1,1 0 0 0 1,1 1,1 0 0 0 1,-1 c 0,-1.36887 -1.13113,-2.5 -2.5,-2.5 z"
id="path178" /><path
style="color:#000000;fill:#241f31;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:1.1;-inkscape-stroke:none"
d="m 162,175 a 1,1 0 0 0 -1,1 v 11 a 1,1 0 0 0 1,1 1,1 0 0 0 1,-1 v -11 a 1,1 0 0 0 -1,-1 z m 10,6 a 1,1 0 0 0 -1,1 v 5 a 1,1 0 0 0 1,1 1,1 0 0 0 1,-1 v -5 a 1,1 0 0 0 -1,-1 z"
id="path164" /><path
style="color:#000000;fill:#241f31;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:1.1;-inkscape-stroke:none"
d="m 165,172 c -2.19729,0 -4,1.80271 -4,4 a 1,1 0 0 0 1,1 1,1 0 0 0 1,-1 c 0,-0.72279 0.43588,-1.23883 1,-1.58984 V 176 a 1,1 0 0 0 1,1 1,1 0 0 0 1,-1 v -3 a 1.0001,1.0001 0 0 0 -1,-1 z"
id="path19-0" /><path
style="color:#000000;fill:#241f31;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:1.1;-inkscape-stroke:none"
d="m 170,172 c -2.19729,0 -4,1.80271 -4,4 a 1,1 0 0 0 1,1 1,1 0 0 0 1,-1 c 0,-0.72279 0.43588,-1.23883 1,-1.58984 V 176 a 1,1 0 0 0 1,1 1,1 0 0 0 1,-1 v -3 a 1.0001,1.0001 0 0 0 -1,-1 z"
id="path166" /><path
style="color:#000000;fill:#241f31;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:1.1;-inkscape-stroke:none"
d="m 165,175 a 1,1 0 0 0 -1,1 1,1 0 0 0 1,1 h 2 a 1,1 0 0 0 1,-1 1,1 0 0 0 -1,-1 z"
id="path167" /><path
style="color:#000000;fill:#241f31;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:1.1;-inkscape-stroke:none"
d="M 170.00586,175 A 1,1 0 0 0 169,175.99414 1,1 0 0 0 169.99414,177 l 2.52539,0.0156 a 1,1 0 0 0 1.00586,-0.99414 1,1 0 0 0 -0.99414,-1.00586 z"
id="path168" /><path
style="color:#000000;fill:#241f31;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:1.1;-inkscape-stroke:none"
d="m 174.01953,176.47461 a 1,1 0 0 0 -1.00391,0.99609 L 173,180.99609 A 1,1 0 0 0 173.99609,182 1,1 0 0 0 175,181.00391 l 0.0156,-3.52539 a 1,1 0 0 0 -0.99609,-1.00391 z"
id="path179" /><path
id="path192"
style="display:inline;fill:#241f31;fill-opacity:1;stroke:none;stroke-width:2;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:1.1;stroke-dasharray:none;stroke-opacity:1;enable-background:new"
d="m 176,180 v 2 c -3.53529,0.2145 -4.23624,2.44366 -5.64991,3.08535 -0.49735,0.22576 -1.42294,0.16467 -1.97533,0.16465 l -0.37477,-1e-5 A 2.0000622,2.0000622 45.001019 0 1 166,183.24993 V 180"
sodipodi:nodetypes="ccccc"
inkscape:path-effect="#path-effect203"
inkscape:original-d="m 176,180 v 2 c -4.25068,0.2579 -4.40387,3.42834 -6.62506,3.25004 L 166,185.24992 V 180" /><g
id="g10"
transform="translate(50)"><g
id="g2"
transform="translate(-1,2)"><path
style="fill:none;fill-opacity:1;stroke:#241f31;stroke-width:2;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:1.1;stroke-dasharray:none;stroke-dashoffset:0;stroke-opacity:1"
id="path2"
sodipodi:type="arc"
sodipodi:cx="173.5"
sodipodi:cy="175.5"
sodipodi:rx="1.4999994"
sodipodi:ry="1.4999994"
sodipodi:start="4.712389"
sodipodi:end="0"
sodipodi:open="true"
sodipodi:arc-type="arc"
d="m 173.5,174 a 1.4999994,1.4999994 0 0 1 1.5,1.5" /></g><path
style="opacity:1;fill:none;fill-opacity:1;stroke:#241f31;stroke-width:2;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:1.1;stroke-dasharray:none;stroke-dashoffset:0;stroke-opacity:1"
d="m 162,176 v 11 m 10,0 v -5"
id="path3"
sodipodi:nodetypes="cccc" /><path
id="path4"
style="display:inline;fill:none;stroke:#241f31;stroke-width:2;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:1.1;enable-background:new"
d="m 162,176 c 0,-1.65685 1.34315,-3 3,-3 v 3"
sodipodi:nodetypes="ccc" /><path
id="path5"
style="display:inline;fill:none;stroke:#241f31;stroke-width:2;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:1.1;enable-background:new"
d="m 167,176 c 0,-1.65685 1.34315,-3 3,-3 v 3"
sodipodi:nodetypes="ccc" /><path
style="opacity:1;fill:none;fill-opacity:1;stroke:#241f31;stroke-width:2;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:1.1;stroke-dasharray:none;stroke-dashoffset:0;stroke-opacity:1"
d="m 165,176 h 2"
id="path6"
sodipodi:nodetypes="cc" /><path
style="opacity:1;fill:none;fill-opacity:1;stroke:#241f31;stroke-width:2;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:1.1;stroke-dasharray:none;stroke-dashoffset:0;stroke-opacity:1"
d="m 170,176 2.52556,0.0158"
id="path7"
sodipodi:nodetypes="cc" /><path
style="opacity:1;fill:none;fill-opacity:1;stroke:#241f31;stroke-width:2;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:1.1;stroke-dasharray:none;stroke-dashoffset:0;stroke-opacity:1"
d="M 174.0158,177.47444 174,181"
id="path8"
sodipodi:nodetypes="cc" /><path
id="path9"
style="display:inline;fill:#241f31;fill-opacity:1;stroke:none;stroke-width:2;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:1.1;stroke-dasharray:none;stroke-opacity:1;enable-background:new"
d="m 176,180 v 2 c -3.53529,0.2145 -4.23624,2.44366 -5.64991,3.08535 -0.49735,0.22576 -1.42294,0.16467 -1.97533,0.16465 l -0.37477,-1e-5 A 2.0000622,2.0000622 45.001019 0 1 166,183.24993 V 180"
sodipodi:nodetypes="ccccc"
inkscape:path-effect="#path-effect10"
inkscape:original-d="m 176,180 v 2 c -4.25068,0.2579 -4.40387,3.42834 -6.62506,3.25004 L 166,185.24992 V 180" /></g><g
id="g243"><g
id="g224"
style="fill:#99c1f1;fill-opacity:1"><g
id="g223"
style="fill:#99c1f1;fill-opacity:1"><rect
style="opacity:1;fill:#99c1f1;fill-opacity:1;stroke-width:2;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:1.1;stroke-dasharray:4, 12"
id="rect222"
width="24"
height="12"
x="2.000001"
y="238" /><rect
style="opacity:1;fill:#99c1f1;fill-opacity:1;stroke-width:2;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:1.1;stroke-dasharray:4, 12"
id="rect223"
width="24"
height="22"
x="2.000001"
y="228"
rx="10.323593"
ry="10.323593" /></g></g><rect
style="opacity:1;fill:#5e5c64;fill-opacity:1;stroke-width:2;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:1.1;stroke-dasharray:4, 12"
id="rect224"
width="8.0000019"
height="23"
x="82"
y="273" /><rect
style="opacity:1;fill:#5e5c64;fill-opacity:1;stroke-width:2;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:1.1;stroke-dasharray:4, 12"
id="rect225"
width="8.0000019"
height="23"
x="100"
y="273" /><rect
style="opacity:1;fill:#5e5c64;fill-opacity:1;stroke-width:2;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:1.1;stroke-dasharray:4, 12"
id="rect226"
width="7.9999981"
height="23"
x="20.000002"
y="273" /><rect
style="opacity:1;fill:#5e5c64;fill-opacity:1;stroke-width:2;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:1.1;stroke-dasharray:4, 12"
id="rect227"
width="7.999999"
height="23"
x="38"
y="273" /><g
id="g229"
style="fill:#1c71d8"><rect
style="display:inline;fill:#6b9bd2;fill-opacity:1;stroke-width:2;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:1.1;stroke-dasharray:4, 12;enable-background:new"
id="rect228"
width="24"
height="38"
x="30"
y="252"
rx="10.323593"
ry="10.323593" /><rect
style="display:inline;fill:#6b9bd2;fill-opacity:1;stroke-width:2;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:1.1;stroke-dasharray:4, 12;enable-background:new"
id="rect229"
width="24"
height="38"
x="92"
y="252"
rx="10.323593"
ry="10.323593" /></g><g
id="g231"
style="fill:#3584e4"><rect
style="fill:#82adde;fill-opacity:1;stroke-width:2;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:1.1;stroke-dasharray:4, 12"
id="rect230"
width="24"
height="38"
x="30"
y="248"
rx="10.323593"
ry="10.323593" /><rect
style="fill:#82adde;fill-opacity:1;stroke-width:2;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:1.1;stroke-dasharray:4, 12"
id="rect231"
width="24"
height="38"
x="92"
y="248"
rx="10.323593"
ry="10.323593" /></g><g
id="g234"
style="fill:#62a0ea;fill-opacity:1"><rect
style="opacity:1;fill:#99c1f1;fill-opacity:1;stroke-width:2;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:1.1;stroke-dasharray:4, 12"
id="rect232"
width="24"
height="38"
x="12.000001"
y="252"
rx="10.323593"
ry="10.323593" /><rect
style="fill:#99c1f1;fill-opacity:1;stroke-width:2;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:1.1;stroke-dasharray:4, 12"
id="rect233"
width="24"
height="38"
x="74"
y="252"
rx="10.323593"
ry="10.323593" /><rect
style="opacity:1;fill:#99c1f1;fill-opacity:1;stroke-width:2;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:1.1;stroke-dasharray:4, 12"
id="rect234"
width="104.00044"
height="50"
x="12.000001"
y="230"
rx="16"
ry="16" /></g><rect
style="fill:#b6d1f2;fill-opacity:1;stroke-width:2;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:1.1;stroke-dasharray:4, 12"
id="rect235"
width="24"
height="38"
x="12.000001"
y="248"
rx="10.323593"
ry="10.323593" /><rect
style="fill:#b6d1f2;fill-opacity:1;stroke-width:2;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:1.1;stroke-dasharray:4, 12"
id="rect236"
width="24"
height="38"
x="74"
y="248"
rx="10.323593"
ry="10.323593" /><g
id="g237"
transform="translate(11)"
style="fill:#99c1f1;fill-opacity:1"><path
style="opacity:1;fill:#bbd6f6;fill-opacity:1;stroke-width:2;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:1.1;stroke-dasharray:4, 12"
id="path236"
sodipodi:type="arc"
sodipodi:cx="100.00001"
sodipodi:cy="180.00003"
sodipodi:rx="8"
sodipodi:ry="8"
sodipodi:start="1.5707963"
sodipodi:end="4.712389"
sodipodi:arc-type="arc"
d="m 100.00001,188.00003 a 8,8 0 0 1 -6.928206,-4 8,8 0 0 1 0,-8 8,8 0 0 1 6.928206,-4"
sodipodi:open="true" /><rect
style="opacity:1;fill:#99c1f1;fill-opacity:1;stroke-width:2;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:1.1;stroke-dasharray:4, 12"
id="rect237"
width="7.9999995"
height="16"
x="92"
y="180" /></g><rect
style="fill:#e9eef4;fill-opacity:1;stroke-width:1.99999;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:1.1;stroke-dasharray:4.00002, 12"
id="rect238"
width="25.999989"
height="14"
x="92.000427"
y="180"
ry="4"
rx="4" /><path
style="fill:#e9eef4;fill-opacity:1;stroke-width:2;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:1.1;stroke-dasharray:4, 12"
id="path238"
sodipodi:type="arc"
sodipodi:cx="100.00001"
sodipodi:cy="180.00003"
sodipodi:rx="8"
sodipodi:ry="8"
sodipodi:start="1.5707963"
sodipodi:end="4.712389"
sodipodi:arc-type="arc"
d="m 100.00001,188.00003 a 8,8 0 0 1 -6.928206,-4 8,8 0 0 1 0,-8 8,8 0 0 1 6.928206,-4"
sodipodi:open="true" /><g
id="g239"
style="fill:#e2eaf3;fill-opacity:1"><path
id="path239"
style="opacity:1;fill:url(#linearGradient243);fill-opacity:1;stroke-width:2;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:1.1;stroke-dasharray:4, 12"
d="m 92,188 v 16 5.67578 C 92,215.39505 87.395036,220 81.675781,220 H 80 28 c -8.863991,0 -16,7.13601 -16,16 v 24 c 0,8.86399 7.136009,16 16,16 h 72 c 8.86399,0 16,-7.13601 16,-16 V 243.67578 236 188 Z" /><rect
style="fill:#e9eef4;fill-opacity:1;stroke-width:1.99999;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:1.1;stroke-dasharray:4.00002, 12"
id="rect239"
width="12.999992"
height="16"
x="92.000427"
y="180" /></g><path
id="path240"
style="opacity:1;fill:#5e5c64;fill-opacity:1;stroke-width:2;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:1.1;stroke-dasharray:4, 12"
d="m 104,186 h 21.99978 v 5.99969 C 114.66472,192.68742 114.92314,200.47547 109,200 l -5,-3.1e-4 z"
sodipodi:nodetypes="cccccc" /><g
id="g242"
style="fill:#e9eef4;fill-opacity:1"
transform="translate(0,-2)"><circle
style="opacity:1;fill:#e9eef4;fill-opacity:1;stroke-width:2;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:1.1;stroke-dasharray:4, 12"
id="circle240"
cx="115.50021"
cy="188"
r="2.5002136" /><circle
style="opacity:1;fill:#e9eef4;fill-opacity:1;stroke-width:2;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:1.1;stroke-dasharray:4, 12"
id="circle241"
cx="110.49979"
cy="188"
r="2.5002136" /><circle
style="opacity:1;fill:#e9eef4;fill-opacity:1;stroke-width:2;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:1.1;stroke-dasharray:4, 12"
id="circle242"
cx="105.50021"
cy="188"
r="2.5002136" /></g></g></g><g
inkscape:groupmode="layer"
id="layer3"
inkscape:label="grid"
style="display:none"
sodipodi:insensitive="true"><circle
cx="64.000031"
cy="236"
r="59.504131"
id="circle2892"
style="display:inline;opacity:0.1;vector-effect:none;fill:none;fill-opacity:1;stroke:#000000;stroke-width:0.99;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:0.99, 0.99;stroke-dashoffset:0;stroke-opacity:1;marker:none;marker-start:none;marker-mid:none;marker-end:none;paint-order:normal;enable-background:new" /><rect
ry="7.9292889"
rx="8.701004"
y="180.49496"
x="20.495007"
height="111.01005"
width="87.009987"
id="rect2894"
style="display:inline;opacity:0.1;vector-effect:none;fill:none;fill-opacity:1;stroke:#000000;stroke-width:0.99;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:0.99, 0.99;stroke-dashoffset:0;stroke-opacity:1;marker:none;marker-start:none;marker-mid:none;marker-end:none;paint-order:normal;enable-background:new" /><rect
ry="7.9238095"
rx="7.9238095"
y="184.49524"
x="12.495266"
height="103.00952"
width="103.00952"
id="rect2896"
style="display:inline;opacity:0.1;vector-effect:none;fill:none;fill-opacity:1;stroke:#000000;stroke-width:0.99;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:0.99, 0.99;stroke-dashoffset:0;stroke-opacity:1;marker:none;marker-start:none;marker-mid:none;marker-end:none;paint-order:normal;enable-background:new" /><rect
ry="8.701005"
rx="7.9292889"
y="200.49496"
x="8.4950066"
height="87.010048"
width="111.01004"
id="rect2898"
style="display:inline;opacity:0.1;vector-effect:none;fill:none;fill-opacity:1;stroke:#000000;stroke-width:0.99;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:0.99, 0.99;stroke-dashoffset:0;stroke-opacity:1;marker:none;marker-start:none;marker-mid:none;marker-end:none;paint-order:normal;enable-background:new" /><path
inkscape:connector-curvature="0"
id="path2900"
d="M 2.6203015e-5,288.99999 H 128.00003"
style="display:inline;fill:none;stroke:#62a0ea;stroke-width:2;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1;enable-background:new" /></g></g></svg>

Before: 32 KiB

Binary file not shown.

Before: 10 KiB

After: 24 KiB

View File

@ -1,101 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg height="128px" viewBox="0 0 128 128" width="128px" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink">
<filter id="a" height="100%" width="100%" x="0%" y="0%">
<feColorMatrix color-interpolation-filters="sRGB" values="0 0 0 0 1 0 0 0 0 1 0 0 0 0 1 0 0 0 1 0"/>
</filter>
<linearGradient id="b" gradientUnits="userSpaceOnUse" x1="48" x2="48" y1="88" y2="48">
<stop offset="0" stop-color="#b6d1f2"/>
<stop offset="1" stop-color="#e9eef4"/>
</linearGradient>
<clipPath id="c">
<rect height="128" width="128"/>
</clipPath>
<clipPath id="d">
<rect height="128" width="128"/>
</clipPath>
<mask id="e">
<g filter="url(#a)">
<g clip-path="url(#d)" filter="url(#a)">
<g clip-path="url(#c)">
<path d="m 2 66 h 24 v 12 h -24 z m 0 0" fill="#99c1f1"/>
<path d="m 12.324219 56 h 3.351562 c 5.703125 0 10.324219 4.621094 10.324219 10.324219 v 1.351562 c 0 5.703125 -4.621094 10.324219 -10.324219 10.324219 h -3.351562 c -5.703125 0 -10.324219 -4.621094 -10.324219 -10.324219 v -1.351562 c 0 -5.703125 4.621094 -10.324219 10.324219 -10.324219 z m 0 0" fill="#99c1f1"/>
<path d="m 82 101 h 8 v 23 h -8 z m 0 0" fill="#5e5c64"/>
<path d="m 100 101 h 8 v 23 h -8 z m 0 0" fill="#5e5c64"/>
<path d="m 20 101 h 8 v 23 h -8 z m 0 0" fill="#5e5c64"/>
<path d="m 38 101 h 8 v 23 h -8 z m 0 0" fill="#5e5c64"/>
<path d="m 40.324219 80 h 3.351562 c 5.703125 0 10.324219 4.621094 10.324219 10.324219 v 17.351562 c 0 5.703125 -4.621094 10.324219 -10.324219 10.324219 h -3.351562 c -5.703125 0 -10.324219 -4.621094 -10.324219 -10.324219 v -17.351562 c 0 -5.703125 4.621094 -10.324219 10.324219 -10.324219 z m 0 0" fill="#6b9bd2"/>
<path d="m 102.324219 80 h 3.351562 c 5.703125 0 10.324219 4.621094 10.324219 10.324219 v 17.351562 c 0 5.703125 -4.621094 10.324219 -10.324219 10.324219 h -3.351562 c -5.703125 0 -10.324219 -4.621094 -10.324219 -10.324219 v -17.351562 c 0 -5.703125 4.621094 -10.324219 10.324219 -10.324219 z m 0 0" fill="#6b9bd2"/>
<path d="m 40.324219 76 h 3.351562 c 5.703125 0 10.324219 4.621094 10.324219 10.324219 v 17.351562 c 0 5.703125 -4.621094 10.324219 -10.324219 10.324219 h -3.351562 c -5.703125 0 -10.324219 -4.621094 -10.324219 -10.324219 v -17.351562 c 0 -5.703125 4.621094 -10.324219 10.324219 -10.324219 z m 0 0" fill="#82adde"/>
<path d="m 102.324219 76 h 3.351562 c 5.703125 0 10.324219 4.621094 10.324219 10.324219 v 17.351562 c 0 5.703125 -4.621094 10.324219 -10.324219 10.324219 h -3.351562 c -5.703125 0 -10.324219 -4.621094 -10.324219 -10.324219 v -17.351562 c 0 -5.703125 4.621094 -10.324219 10.324219 -10.324219 z m 0 0" fill="#82adde"/>
<path d="m 22.324219 80 h 3.351562 c 5.703125 0 10.324219 4.621094 10.324219 10.324219 v 17.351562 c 0 5.703125 -4.621094 10.324219 -10.324219 10.324219 h -3.351562 c -5.703125 0 -10.324219 -4.621094 -10.324219 -10.324219 v -17.351562 c 0 -5.703125 4.621094 -10.324219 10.324219 -10.324219 z m 0 0" fill="#99c1f1"/>
<path d="m 84.324219 80 h 3.351562 c 5.703125 0 10.324219 4.621094 10.324219 10.324219 v 17.351562 c 0 5.703125 -4.621094 10.324219 -10.324219 10.324219 h -3.351562 c -5.703125 0 -10.324219 -4.621094 -10.324219 -10.324219 v -17.351562 c 0 -5.703125 4.621094 -10.324219 10.324219 -10.324219 z m 0 0" fill="#99c1f1"/>
<path d="m 28 58 h 72 c 8.835938 0 16 7.164062 16 16 v 18 c 0 8.835938 -7.164062 16 -16 16 h -72 c -8.835938 0 -16 -7.164062 -16 -16 v -18 c 0 -8.835938 7.164062 -16 16 -16 z m 0 0" fill="#99c1f1"/>
<path d="m 22.324219 76 h 3.351562 c 5.703125 0 10.324219 4.621094 10.324219 10.324219 v 17.351562 c 0 5.703125 -4.621094 10.324219 -10.324219 10.324219 h -3.351562 c -5.703125 0 -10.324219 -4.621094 -10.324219 -10.324219 v -17.351562 c 0 -5.703125 4.621094 -10.324219 10.324219 -10.324219 z m 0 0" fill="#b6d1f2"/>
<path d="m 84.324219 76 h 3.351562 c 5.703125 0 10.324219 4.621094 10.324219 10.324219 v 17.351562 c 0 5.703125 -4.621094 10.324219 -10.324219 10.324219 h -3.351562 c -5.703125 0 -10.324219 -4.621094 -10.324219 -10.324219 v -17.351562 c 0 -5.703125 4.621094 -10.324219 10.324219 -10.324219 z m 0 0" fill="#b6d1f2"/>
<path d="m 111 16 c -2.859375 0 -5.5 -1.523438 -6.929688 -4 c -1.425781 -2.476562 -1.425781 -5.523438 0 -8 c 1.429688 -2.476562 4.070313 -4 6.929688 -4" fill="#bbd6f6"/>
<path d="m 103 8 h 8 v 16 h -8 z m 0 0" fill="#99c1f1"/>
<path d="m 96 8 h 18 c 2.210938 0 4 1.789062 4 4 v 6 c 0 2.210938 -1.789062 4 -4 4 h -18 c -2.207031 0 -4 -1.789062 -4 -4 v -6 c 0 -2.210938 1.792969 -4 4 -4 z m 0 0" fill="#e9eef4"/>
<path d="m 100 16 c -2.859375 0 -5.5 -1.523438 -6.929688 -4 c -1.425781 -2.476562 -1.425781 -5.523438 0 -8 c 1.429688 -2.476562 4.070313 -4 6.929688 -4" fill="#e9eef4"/>
<path d="m 92 16 v 21.675781 c 0 5.71875 -4.605469 10.324219 -10.324219 10.324219 h -53.675781 c -8.863281 0 -16 7.136719 -16 16 v 24 c 0 8.863281 7.136719 16 16 16 h 72 c 8.863281 0 16 -7.136719 16 -16 v -72 z m 0 0" fill="url(#b)"/>
<path d="m 92 8 h 13 v 16 h -13 z m 0 0" fill="#e9eef4"/>
<path d="m 104 14 h 22 v 6 c -11.335938 0.6875 -11.078125 8.476562 -17 8 h -5 z m 0 0" fill="#5e5c64"/>
<path d="m 118 14 c 0 1.378906 -1.117188 2.5 -2.5 2.5 c -1.378906 0 -2.5 -1.121094 -2.5 -2.5 s 1.121094 -2.5 2.5 -2.5 c 1.382812 0 2.5 1.121094 2.5 2.5 z m 0 0" fill="#e9eef4"/>
<path d="m 113 14 c 0 1.378906 -1.121094 2.5 -2.5 2.5 c -1.382812 0 -2.5 -1.121094 -2.5 -2.5 s 1.117188 -2.5 2.5 -2.5 c 1.378906 0 2.5 1.121094 2.5 2.5 z m 0 0" fill="#e9eef4"/>
<path d="m 108 14 c 0 1.378906 -1.117188 2.5 -2.5 2.5 c -1.378906 0 -2.5 -1.121094 -2.5 -2.5 s 1.121094 -2.5 2.5 -2.5 c 1.382812 0 2.5 1.121094 2.5 2.5 z m 0 0" fill="#e9eef4"/>
</g>
</g>
</g>
</mask>
<mask id="f">
<g filter="url(#a)">
<rect fill-opacity="0.8" height="184.32" width="184.32" x="-28.16" y="-28.16"/>
</g>
</mask>
<linearGradient id="g" gradientTransform="matrix(0 0.37 -0.98462 0 295.38501 -30.360001)" gradientUnits="userSpaceOnUse" x1="300" x2="428" y1="235" y2="235">
<stop offset="0" stop-color="#f9f06b"/>
<stop offset="1" stop-color="#f5c211"/>
</linearGradient>
<clipPath id="h">
<rect height="128" width="128"/>
</clipPath>
<clipPath id="i">
<rect height="128" width="128"/>
</clipPath>
<path d="m 2 66 h 24 v 12 h -24 z m 0 0" fill="#99c1f1"/>
<path d="m 12.324219 56 h 3.351562 c 5.703125 0 10.324219 4.621094 10.324219 10.324219 v 1.351562 c 0 5.703125 -4.621094 10.324219 -10.324219 10.324219 h -3.351562 c -5.703125 0 -10.324219 -4.621094 -10.324219 -10.324219 v -1.351562 c 0 -5.703125 4.621094 -10.324219 10.324219 -10.324219 z m 0 0" fill="#99c1f1"/>
<g fill="#5e5c64">
<path d="m 82 101 h 8 v 23 h -8 z m 0 0"/>
<path d="m 100 101 h 8 v 23 h -8 z m 0 0"/>
<path d="m 20 101 h 8 v 23 h -8 z m 0 0"/>
<path d="m 38 101 h 8 v 23 h -8 z m 0 0"/>
</g>
<path d="m 40.324219 80 h 3.351562 c 5.703125 0 10.324219 4.621094 10.324219 10.324219 v 17.351562 c 0 5.703125 -4.621094 10.324219 -10.324219 10.324219 h -3.351562 c -5.703125 0 -10.324219 -4.621094 -10.324219 -10.324219 v -17.351562 c 0 -5.703125 4.621094 -10.324219 10.324219 -10.324219 z m 0 0" fill="#6b9bd2"/>
<path d="m 102.324219 80 h 3.351562 c 5.703125 0 10.324219 4.621094 10.324219 10.324219 v 17.351562 c 0 5.703125 -4.621094 10.324219 -10.324219 10.324219 h -3.351562 c -5.703125 0 -10.324219 -4.621094 -10.324219 -10.324219 v -17.351562 c 0 -5.703125 4.621094 -10.324219 10.324219 -10.324219 z m 0 0" fill="#6b9bd2"/>
<path d="m 40.324219 76 h 3.351562 c 5.703125 0 10.324219 4.621094 10.324219 10.324219 v 17.351562 c 0 5.703125 -4.621094 10.324219 -10.324219 10.324219 h -3.351562 c -5.703125 0 -10.324219 -4.621094 -10.324219 -10.324219 v -17.351562 c 0 -5.703125 4.621094 -10.324219 10.324219 -10.324219 z m 0 0" fill="#82adde"/>
<path d="m 102.324219 76 h 3.351562 c 5.703125 0 10.324219 4.621094 10.324219 10.324219 v 17.351562 c 0 5.703125 -4.621094 10.324219 -10.324219 10.324219 h -3.351562 c -5.703125 0 -10.324219 -4.621094 -10.324219 -10.324219 v -17.351562 c 0 -5.703125 4.621094 -10.324219 10.324219 -10.324219 z m 0 0" fill="#82adde"/>
<path d="m 22.324219 80 h 3.351562 c 5.703125 0 10.324219 4.621094 10.324219 10.324219 v 17.351562 c 0 5.703125 -4.621094 10.324219 -10.324219 10.324219 h -3.351562 c -5.703125 0 -10.324219 -4.621094 -10.324219 -10.324219 v -17.351562 c 0 -5.703125 4.621094 -10.324219 10.324219 -10.324219 z m 0 0" fill="#99c1f1"/>
<path d="m 84.324219 80 h 3.351562 c 5.703125 0 10.324219 4.621094 10.324219 10.324219 v 17.351562 c 0 5.703125 -4.621094 10.324219 -10.324219 10.324219 h -3.351562 c -5.703125 0 -10.324219 -4.621094 -10.324219 -10.324219 v -17.351562 c 0 -5.703125 4.621094 -10.324219 10.324219 -10.324219 z m 0 0" fill="#99c1f1"/>
<path d="m 28 58 h 72 c 8.835938 0 16 7.164062 16 16 v 18 c 0 8.835938 -7.164062 16 -16 16 h -72 c -8.835938 0 -16 -7.164062 -16 -16 v -18 c 0 -8.835938 7.164062 -16 16 -16 z m 0 0" fill="#99c1f1"/>
<path d="m 22.324219 76 h 3.351562 c 5.703125 0 10.324219 4.621094 10.324219 10.324219 v 17.351562 c 0 5.703125 -4.621094 10.324219 -10.324219 10.324219 h -3.351562 c -5.703125 0 -10.324219 -4.621094 -10.324219 -10.324219 v -17.351562 c 0 -5.703125 4.621094 -10.324219 10.324219 -10.324219 z m 0 0" fill="#b6d1f2"/>
<path d="m 84.324219 76 h 3.351562 c 5.703125 0 10.324219 4.621094 10.324219 10.324219 v 17.351562 c 0 5.703125 -4.621094 10.324219 -10.324219 10.324219 h -3.351562 c -5.703125 0 -10.324219 -4.621094 -10.324219 -10.324219 v -17.351562 c 0 -5.703125 4.621094 -10.324219 10.324219 -10.324219 z m 0 0" fill="#b6d1f2"/>
<path d="m 111 16 c -2.859375 0 -5.5 -1.523438 -6.929688 -4 c -1.425781 -2.476562 -1.425781 -5.523438 0 -8 c 1.429688 -2.476562 4.070313 -4 6.929688 -4" fill="#bbd6f6"/>
<path d="m 103 8 h 8 v 16 h -8 z m 0 0" fill="#99c1f1"/>
<path d="m 96 8 h 18 c 2.210938 0 4 1.789062 4 4 v 6 c 0 2.210938 -1.789062 4 -4 4 h -18 c -2.207031 0 -4 -1.789062 -4 -4 v -6 c 0 -2.210938 1.792969 -4 4 -4 z m 0 0" fill="#e9eef4"/>
<path d="m 100 16 c -2.859375 0 -5.5 -1.523438 -6.929688 -4 c -1.425781 -2.476562 -1.425781 -5.523438 0 -8 c 1.429688 -2.476562 4.070313 -4 6.929688 -4" fill="#e9eef4"/>
<path d="m 92 16 v 21.675781 c 0 5.71875 -4.605469 10.324219 -10.324219 10.324219 h -53.675781 c -8.863281 0 -16 7.136719 -16 16 v 24 c 0 8.863281 7.136719 16 16 16 h 72 c 8.863281 0 16 -7.136719 16 -16 v -72 z m 0 0" fill="url(#b)"/>
<path d="m 92 8 h 13 v 16 h -13 z m 0 0" fill="#e9eef4"/>
<path d="m 104 14 h 22 v 6 c -11.335938 0.6875 -11.078125 8.476562 -17 8 h -5 z m 0 0" fill="#5e5c64"/>
<path d="m 118 14 c 0 1.378906 -1.117188 2.5 -2.5 2.5 c -1.378906 0 -2.5 -1.121094 -2.5 -2.5 s 1.121094 -2.5 2.5 -2.5 c 1.382812 0 2.5 1.121094 2.5 2.5 z m 0 0" fill="#e9eef4"/>
<path d="m 113 14 c 0 1.378906 -1.121094 2.5 -2.5 2.5 c -1.382812 0 -2.5 -1.121094 -2.5 -2.5 s 1.117188 -2.5 2.5 -2.5 c 1.378906 0 2.5 1.121094 2.5 2.5 z m 0 0" fill="#e9eef4"/>
<path d="m 108 14 c 0 1.378906 -1.117188 2.5 -2.5 2.5 c -1.378906 0 -2.5 -1.121094 -2.5 -2.5 s 1.121094 -2.5 2.5 -2.5 c 1.382812 0 2.5 1.121094 2.5 2.5 z m 0 0" fill="#e9eef4"/>
<g mask="url(#e)">
<g clip-path="url(#i)">
<g mask="url(#f)">
<g clip-path="url(#h)">
<path d="m 128 80.640625 v 47.359375 h -128 v -47.359375 z m 0 0" fill="url(#g)"/>
<path d="m 13.308594 80.640625 l 47.355468 47.359375 h 21.214844 l -47.359375 -47.359375 z m 42.421875 0 l 47.363281 47.359375 h 21.214844 l -47.363282 -47.359375 z m 42.429687 0 l 29.839844 29.839844 v -21.210938 l -8.628906 -8.628906 z m -98.160156 7.90625 v 21.214844 l 18.238281 18.238281 h 21.214844 z m 0 0"/>
</g>
</g>
</g>
</g>
</svg>

Before: 12 KiB

View File

@ -1,150 +1,272 @@
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!-- Created with Inkscape (http://www.inkscape.org/) -->
<svg
height="128px"
width="128"
height="128"
id="svg11300"
version="1.0"
style="display:inline;enable-background:new"
viewBox="0 0 128 128"
width="128px"
version="1.1"
id="svg26"
sodipodi:docname="com.jeffser.Alpaca.svg"
inkscape:version="1.3.2 (091e20ef0f, 2023-11-25)"
xmlns:inkscape="http://www.inkscape.org/namespaces/inkscape"
xmlns:sodipodi="http://sodipodi.sourceforge.net/DTD/sodipodi-0.dtd"
xmlns:xlink="http://www.w3.org/1999/xlink"
xmlns="http://www.w3.org/2000/svg"
xmlns:svg="http://www.w3.org/2000/svg">
xmlns:svg="http://www.w3.org/2000/svg"
xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
xmlns:cc="http://creativecommons.org/ns#"
xmlns:dc="http://purl.org/dc/elements/1.1/">
<title
id="title4162">Adwaita Icon Template</title>
<defs
id="defs26" />
<sodipodi:namedview
id="namedview26"
pagecolor="#ffffff"
bordercolor="#000000"
borderopacity="0.25"
inkscape:showpageshadow="2"
inkscape:pageopacity="0.0"
inkscape:pagecheckerboard="0"
inkscape:deskcolor="#d1d1d1"
inkscape:zoom="6.65625"
inkscape:cx="64"
inkscape:cy="64"
inkscape:window-width="1920"
inkscape:window-height="1011"
inkscape:window-x="0"
inkscape:window-y="0"
inkscape:window-maximized="1"
inkscape:current-layer="svg26" />
<linearGradient
id="a"
gradientUnits="userSpaceOnUse"
x1="48"
x2="48"
y1="88"
y2="48">
<stop
offset="0"
stop-color="#b6d1f2"
id="stop1" />
<stop
offset="1"
stop-color="#e9eef4"
id="stop2" />
</linearGradient>
<path
d="m 2 66 h 24 v 12 h -24 z m 0 0"
fill="#99c1f1"
id="path2" />
<path
d="m 12.324219 56 h 3.351562 c 5.703125 0 10.324219 4.621094 10.324219 10.324219 v 1.351562 c 0 5.703125 -4.621094 10.324219 -10.324219 10.324219 h -3.351562 c -5.703125 0 -10.324219 -4.621094 -10.324219 -10.324219 v -1.351562 c 0 -5.703125 4.621094 -10.324219 10.324219 -10.324219 z m 0 0"
fill="#99c1f1"
id="path3" />
id="defs3">
<linearGradient
id="linearGradient35">
<stop
style="stop-color:#3d3846;stop-opacity:1;"
offset="0"
id="stop35" />
<stop
style="stop-color:#241f31;stop-opacity:1;"
offset="1"
id="stop36" />
</linearGradient>
<linearGradient
id="linearGradient33">
<stop
style="stop-color:#99c1ff;stop-opacity:1;"
offset="0"
id="stop33" />
<stop
style="stop-color:#62a0ea;stop-opacity:1;"
offset="1"
id="stop34" />
</linearGradient>
<linearGradient
id="linearGradient31">
<stop
style="stop-color:#deddda;stop-opacity:1;"
offset="0"
id="stop31" />
<stop
style="stop-color:#f6f5f4;stop-opacity:1;"
offset="1"
id="stop32" />
</linearGradient>
<linearGradient
id="linearGradient24">
<stop
style="stop-color:#f6f5f4;stop-opacity:1;"
offset="0"
id="stop24" />
<stop
style="stop-color:#deddda;stop-opacity:1;"
offset="1"
id="stop23" />
</linearGradient>
<linearGradient
id="linearGradient12">
<stop
style="stop-color:#000000;stop-opacity:0;"
offset="0"
id="stop12" />
<stop
style="stop-color:#000000;stop-opacity:0;"
offset="1"
id="stop13" />
</linearGradient>
<linearGradient
xlink:href="#linearGradient12"
id="linearGradient13"
x1="40.888428"
y1="205.03607"
x2="40.90649"
y2="212.09515"
gradientUnits="userSpaceOnUse" />
<linearGradient
xlink:href="#linearGradient24"
id="linearGradient23"
x1="40.888428"
y1="205.03607"
x2="40.90649"
y2="212.09515"
gradientUnits="userSpaceOnUse" />
<linearGradient
xlink:href="#linearGradient24"
id="linearGradient29"
gradientUnits="userSpaceOnUse"
x1="40.888428"
y1="205.03607"
x2="40.90649"
y2="212.09515"
gradientTransform="matrix(-1,0,0,1,127.14843,2.8384866e-4)" />
<linearGradient
xlink:href="#linearGradient12"
id="linearGradient30"
gradientUnits="userSpaceOnUse"
x1="40.888428"
y1="205.03607"
x2="40.90649"
y2="212.09515"
gradientTransform="matrix(-1,0,0,1,127.14843,2.8384866e-4)" />
<linearGradient
xlink:href="#linearGradient31"
id="linearGradient32"
x1="63.552597"
y1="214.19464"
x2="63.552597"
y2="241.24492"
gradientUnits="userSpaceOnUse" />
<linearGradient
xlink:href="#linearGradient33"
id="linearGradient34"
x1="64.683159"
y1="267.04626"
x2="64.895935"
y2="278.69958"
gradientUnits="userSpaceOnUse"
gradientTransform="matrix(0.99245101,0,0,1.1818075,0.48386604,-51.63542)" />
<linearGradient
xlink:href="#linearGradient35"
id="linearGradient36"
x1="45.111782"
y1="235.32567"
x2="45.111782"
y2="229.17581"
gradientUnits="userSpaceOnUse" />
<linearGradient
xlink:href="#linearGradient35"
id="linearGradient37"
gradientUnits="userSpaceOnUse"
x1="45.111782"
y1="235.32567"
x2="45.111782"
y2="229.17581"
gradientTransform="translate(36.957243,0.15686125)" />
</defs>
<metadata
id="metadata4">
<rdf:RDF>
<cc:Work
rdf:about="">
<dc:format>image/svg+xml</dc:format>
<dc:type
rdf:resource="http://purl.org/dc/dcmitype/StillImage" />
<dc:creator>
<cc:Agent>
<dc:title>GNOME Design Team</dc:title>
</cc:Agent>
</dc:creator>
<dc:source />
<cc:license
rdf:resource="http://creativecommons.org/licenses/by-sa/4.0/" />
<dc:title>Adwaita Icon Template</dc:title>
<dc:subject>
<rdf:Bag />
</dc:subject>
<dc:date />
<dc:rights>
<cc:Agent>
<dc:title />
</cc:Agent>
</dc:rights>
<dc:publisher>
<cc:Agent>
<dc:title />
</cc:Agent>
</dc:publisher>
<dc:identifier />
<dc:relation />
<dc:language />
<dc:coverage />
<dc:description />
<dc:contributor>
<cc:Agent>
<dc:title />
</cc:Agent>
</dc:contributor>
</cc:Work>
<cc:License
rdf:about="http://creativecommons.org/licenses/by-sa/4.0/">
<cc:permits
rdf:resource="http://creativecommons.org/ns#Reproduction" />
<cc:permits
rdf:resource="http://creativecommons.org/ns#Distribution" />
<cc:requires
rdf:resource="http://creativecommons.org/ns#Notice" />
<cc:requires
rdf:resource="http://creativecommons.org/ns#Attribution" />
<cc:permits
rdf:resource="http://creativecommons.org/ns#DerivativeWorks" />
<cc:requires
rdf:resource="http://creativecommons.org/ns#ShareAlike" />
</cc:License>
</rdf:RDF>
</metadata>
<g
fill="#5e5c64"
id="g7">
<path
d="m 82 101 h 8 v 23 h -8 z m 0 0"
id="path4" />
<path
d="m 100 101 h 8 v 23 h -8 z m 0 0"
id="path5" />
<path
d="m 20 101 h 8 v 23 h -8 z m 0 0"
id="path6" />
<path
d="m 38 101 h 8 v 23 h -8 z m 0 0"
id="path7" />
id="layer1"
style="display:inline"
transform="translate(0,-172)">
<g
id="layer4"
transform="matrix(1.2477821,0,0,1.2477821,-15.858054,-61.000241)">
<rect
style="display:inline;fill:url(#linearGradient32);fill-opacity:1;stroke:#000000;stroke-width:2.91036;stroke-dasharray:none;stroke-opacity:0;enable-background:new"
id="rect1"
width="81.276649"
height="56.210243"
x="23.361675"
y="213.42638"
ry="13.680508" />
<path
style="display:inline;fill:url(#linearGradient23);fill-opacity:1;fill-rule:evenodd;stroke:url(#linearGradient13);stroke-width:3;stroke-linecap:round;stroke-dasharray:none;stroke-opacity:0;enable-background:new"
d="m 32.848631,215.48926 c 0,0 0.309082,-16.43408 7.036621,-19.01123 6.727539,-2.57715 10.409179,18.9375 10.409179,18.9375"
id="path1" />
<path
style="display:inline;fill:url(#linearGradient29);fill-opacity:1;fill-rule:evenodd;stroke:url(#linearGradient30);stroke-width:3;stroke-linecap:round;stroke-dasharray:none;stroke-opacity:0;enable-background:new"
d="m 94.2998,215.48954 c 0,0 -0.309082,-16.43408 -7.036621,-19.01123 C 80.53564,193.90116 76.854,215.41581 76.854,215.41581"
id="path1-7" />
<circle
style="display:inline;fill:url(#linearGradient36);fill-opacity:1;stroke:none;stroke-width:3;stroke-linecap:round;stroke-dasharray:none;enable-background:new"
id="path3"
cx="45.129391"
cy="235.05762"
r="5.8994136" />
<circle
style="display:inline;fill:url(#linearGradient37);fill-opacity:1;stroke:none;stroke-width:3;stroke-linecap:round;stroke-dasharray:none;enable-background:new"
id="path3-6"
cx="82.086639"
cy="235.21448"
r="5.8994136" />
<path
style="display:inline;fill:none;stroke:#241f31;stroke-width:3;stroke-linecap:round;stroke-dasharray:none;stroke-opacity:1;enable-background:new"
d="m 32.392577,229.59423 c 0,0 25.649412,1.06983 24.62744,-3.62353"
id="path4" />
<path
style="display:inline;fill:none;stroke:#241f31;stroke-width:2.99972;stroke-linecap:round;stroke-dasharray:none;stroke-opacity:1;enable-background:new"
d="m 94.823791,229.75097 c 0,0 -25.649412,1.06983 -24.62744,-3.62353"
id="path4-5" />
<path
style="display:inline;fill:url(#linearGradient34);fill-opacity:1;stroke:none;stroke-width:3.24899;stroke-linecap:round;stroke-dasharray:none;enable-background:new"
d="m 23.60202,258.09454 c 1.243956,12.49842 3.858832,15.67625 12.858734,14.90301 8.999901,-0.77326 9.384671,10.27444 19.65082,5.37353 10.266149,-4.90093 14.08815,-3.56159 14.703102,-3.63198 0.614951,-0.0704 15.397528,10.41294 20.253656,3.89337 4.856129,-6.51955 7.043107,-1.94985 9.232508,-4.41272 2.1894,-2.46288 4.89442,-9.45966 3.87579,-16.22158"
id="path8-0" />
<path
style="display:inline;fill:#f6f5f4;fill-opacity:1;stroke:none;stroke-width:2.99884;stroke-linecap:round;stroke-dasharray:none;enable-background:new"
d="m 23.389225,256.86198 c 1.25195,10.5799 3.883629,13.26993 12.941363,12.61538 9.057733,-0.65456 9.444975,8.6973 19.777093,4.54868 10.332117,-4.14862 14.178679,-3.01487 14.797582,-3.07446 0.618903,-0.0596 15.49647,8.81454 20.383803,3.29574 4.887333,-5.5188 7.088365,-1.65056 9.291844,-3.73537 2.20346,-2.08482 4.92586,-8.00759 3.90069,-13.73155"
id="path8" />
<path
style="display:inline;fill:none;stroke:#241f31;stroke-width:2.99972;stroke-linecap:round;stroke-linejoin:miter;stroke-dasharray:none;stroke-opacity:1;enable-background:new"
d="m 63.095594,248.37344 c 0,0 10.15573,26.47309 21.090617,10.976"
id="path2" />
<path
style="display:inline;fill:none;stroke:#241f31;stroke-width:2.99972;stroke-linecap:round;stroke-linejoin:miter;stroke-dasharray:none;stroke-opacity:1;enable-background:new"
d="m 64.150661,248.40941 c 0,0 -10.15573,26.47309 -21.090617,10.976"
id="path2-6" />
<ellipse
style="display:inline;fill:#000000;stroke:none;stroke-width:1.29382;stroke-linecap:round;stroke-dasharray:none;enable-background:new"
id="path5"
cx="63.564262"
cy="248.16406"
rx="4.5169015"
ry="4.2407222" />
</g>
</g>
<path
d="m 40.324219 80 h 3.351562 c 5.703125 0 10.324219 4.621094 10.324219 10.324219 v 17.351562 c 0 5.703125 -4.621094 10.324219 -10.324219 10.324219 h -3.351562 c -5.703125 0 -10.324219 -4.621094 -10.324219 -10.324219 v -17.351562 c 0 -5.703125 4.621094 -10.324219 10.324219 -10.324219 z m 0 0"
fill="#6b9bd2"
id="path8" />
<path
d="m 102.324219 80 h 3.351562 c 5.703125 0 10.324219 4.621094 10.324219 10.324219 v 17.351562 c 0 5.703125 -4.621094 10.324219 -10.324219 10.324219 h -3.351562 c -5.703125 0 -10.324219 -4.621094 -10.324219 -10.324219 v -17.351562 c 0 -5.703125 4.621094 -10.324219 10.324219 -10.324219 z m 0 0"
fill="#6b9bd2"
id="path9" />
<path
d="m 40.324219 76 h 3.351562 c 5.703125 0 10.324219 4.621094 10.324219 10.324219 v 17.351562 c 0 5.703125 -4.621094 10.324219 -10.324219 10.324219 h -3.351562 c -5.703125 0 -10.324219 -4.621094 -10.324219 -10.324219 v -17.351562 c 0 -5.703125 4.621094 -10.324219 10.324219 -10.324219 z m 0 0"
fill="#82adde"
id="path10" />
<path
d="m 102.324219 76 h 3.351562 c 5.703125 0 10.324219 4.621094 10.324219 10.324219 v 17.351562 c 0 5.703125 -4.621094 10.324219 -10.324219 10.324219 h -3.351562 c -5.703125 0 -10.324219 -4.621094 -10.324219 -10.324219 v -17.351562 c 0 -5.703125 4.621094 -10.324219 10.324219 -10.324219 z m 0 0"
fill="#82adde"
id="path11" />
<path
d="m 22.324219 80 h 3.351562 c 5.703125 0 10.324219 4.621094 10.324219 10.324219 v 17.351562 c 0 5.703125 -4.621094 10.324219 -10.324219 10.324219 h -3.351562 c -5.703125 0 -10.324219 -4.621094 -10.324219 -10.324219 v -17.351562 c 0 -5.703125 4.621094 -10.324219 10.324219 -10.324219 z m 0 0"
fill="#99c1f1"
id="path12" />
<path
d="m 84.324219 80 h 3.351562 c 5.703125 0 10.324219 4.621094 10.324219 10.324219 v 17.351562 c 0 5.703125 -4.621094 10.324219 -10.324219 10.324219 h -3.351562 c -5.703125 0 -10.324219 -4.621094 -10.324219 -10.324219 v -17.351562 c 0 -5.703125 4.621094 -10.324219 10.324219 -10.324219 z m 0 0"
fill="#99c1f1"
id="path13" />
<path
d="m 28 58 h 72 c 8.835938 0 16 7.164062 16 16 v 18 c 0 8.835938 -7.164062 16 -16 16 h -72 c -8.835938 0 -16 -7.164062 -16 -16 v -18 c 0 -8.835938 7.164062 -16 16 -16 z m 0 0"
fill="#99c1f1"
id="path14" />
<path
d="m 22.324219 76 h 3.351562 c 5.703125 0 10.324219 4.621094 10.324219 10.324219 v 17.351562 c 0 5.703125 -4.621094 10.324219 -10.324219 10.324219 h -3.351562 c -5.703125 0 -10.324219 -4.621094 -10.324219 -10.324219 v -17.351562 c 0 -5.703125 4.621094 -10.324219 10.324219 -10.324219 z m 0 0"
fill="#b6d1f2"
id="path15" />
<path
d="m 84.324219 76 h 3.351562 c 5.703125 0 10.324219 4.621094 10.324219 10.324219 v 17.351562 c 0 5.703125 -4.621094 10.324219 -10.324219 10.324219 h -3.351562 c -5.703125 0 -10.324219 -4.621094 -10.324219 -10.324219 v -17.351562 c 0 -5.703125 4.621094 -10.324219 10.324219 -10.324219 z m 0 0"
fill="#b6d1f2"
id="path16" />
<path
d="m 111 16 c -2.859375 0 -5.5 -1.523438 -6.929688 -4 c -1.425781 -2.476562 -1.425781 -5.523438 0 -8 c 1.429688 -2.476562 4.070313 -4 6.929688 -4"
fill="#bbd6f6"
id="path17" />
<path
d="m 103 8 h 8 v 16 h -8 z m 0 0"
fill="#99c1f1"
id="path18" />
<path
d="m 96 8 h 18 c 2.210938 0 4 1.789062 4 4 v 6 c 0 2.210938 -1.789062 4 -4 4 h -18 c -2.207031 0 -4 -1.789062 -4 -4 v -6 c 0 -2.210938 1.792969 -4 4 -4 z m 0 0"
fill="#e9eef4"
id="path19" />
<path
d="m 100 16 c -2.859375 0 -5.5 -1.523438 -6.929688 -4 c -1.425781 -2.476562 -1.425781 -5.523438 0 -8 c 1.429688 -2.476562 4.070313 -4 6.929688 -4"
fill="#e9eef4"
id="path20" />
<path
d="m 92 16 v 21.675781 c 0 5.71875 -4.605469 10.324219 -10.324219 10.324219 h -53.675781 c -8.863281 0 -16 7.136719 -16 16 v 24 c 0 8.863281 7.136719 16 16 16 h 72 c 8.863281 0 16 -7.136719 16 -16 v -72 z m 0 0"
fill="url(#a)"
id="path21" />
<path
d="m 92 8 h 13 v 16 h -13 z m 0 0"
fill="#e9eef4"
id="path22" />
<path
d="m 104 14 h 22 v 6 c -11.335938 0.6875 -11.078125 8.476562 -17 8 h -5 z m 0 0"
fill="#5e5c64"
id="path23" />
<path
d="m 118 14 c 0 1.378906 -1.117188 2.5 -2.5 2.5 c -1.378906 0 -2.5 -1.121094 -2.5 -2.5 s 1.121094 -2.5 2.5 -2.5 c 1.382812 0 2.5 1.121094 2.5 2.5 z m 0 0"
fill="#e9eef4"
id="path24" />
<path
d="m 113 14 c 0 1.378906 -1.121094 2.5 -2.5 2.5 c -1.382812 0 -2.5 -1.121094 -2.5 -2.5 s 1.117188 -2.5 2.5 -2.5 c 1.378906 0 2.5 1.121094 2.5 2.5 z m 0 0"
fill="#e9eef4"
id="path25" />
<path
d="m 108 14 c 0 1.378906 -1.117188 2.5 -2.5 2.5 c -1.378906 0 -2.5 -1.121094 -2.5 -2.5 s 1.121094 -2.5 2.5 -2.5 c 1.382812 0 2.5 1.121094 2.5 2.5 z m 0 0"
fill="#e9eef4"
id="path26" />
</svg>

Before: 6.7 KiB

After: 10 KiB

View File

@ -1,13 +1 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg height="16px" viewBox="0 0 16 16" width="16px" xmlns="http://www.w3.org/2000/svg">
<g fill="#241f31">
<path d="m 12.5 3 c -0.550781 0 -1 0.449219 -1 1 s 0.449219 1 1 1 c 0.289062 0 0.5 0.210938 0.5 0.5 c 0 0.550781 0.449219 1 1 1 s 1 -0.449219 1 -1 c 0 -1.367188 -1.132812 -2.5 -2.5 -2.5 z m 0 0"/>
<path d="m 2 3 c -0.550781 0 -1 0.449219 -1 1 v 11 c 0 0.550781 0.449219 1 1 1 s 1 -0.449219 1 -1 v -11 c 0 -0.550781 -0.449219 -1 -1 -1 z m 10 6 c -0.550781 0 -1 0.449219 -1 1 v 5 c 0 0.550781 0.449219 1 1 1 s 1 -0.449219 1 -1 v -5 c 0 -0.550781 -0.449219 -1 -1 -1 z m 0 0"/>
<path d="m 5 0 c -2.199219 0 -4 1.800781 -4 4 c 0 0.550781 0.449219 1 1 1 s 1 -0.449219 1 -1 c 0 -0.722656 0.4375 -1.238281 1 -1.589844 v 1.589844 c 0 0.550781 0.449219 1 1 1 s 1 -0.449219 1 -1 v -3 c 0 -0.550781 -0.449219 -1 -1 -1 z m 0 0"/>
<path d="m 10 0 c -2.199219 0 -4 1.800781 -4 4 c 0 0.550781 0.449219 1 1 1 s 1 -0.449219 1 -1 c 0 -0.722656 0.4375 -1.238281 1 -1.589844 v 1.589844 c 0 0.550781 0.449219 1 1 1 s 1 -0.449219 1 -1 v -3 c 0 -0.550781 -0.449219 -1 -1 -1 z m 0 0"/>
<path d="m 5 3 c -0.550781 0 -1 0.449219 -1 1 s 0.449219 1 1 1 h 2 c 0.550781 0 1 -0.449219 1 -1 s -0.449219 -1 -1 -1 z m 0 0"/>
<path d="m 10.007812 3 c -0.554687 -0.003906 -1.003906 0.441406 -1.007812 0.992188 c -0.003906 0.554687 0.441406 1.003906 0.992188 1.007812 l 2.527343 0.015625 c 0.550781 0.003906 1.003907 -0.441406 1.003907 -0.996094 c 0.003906 -0.550781 -0.441407 -1 -0.992188 -1.003906 z m 0 0"/>
<path d="m 14.019531 4.476562 c -0.550781 -0.003906 -1 0.441407 -1.003906 0.992188 l -0.015625 3.527344 c -0.003906 0.550781 0.445312 1 0.996094 1.003906 c 0.550781 0.003906 1 -0.445312 1.003906 -0.996094 l 0.015625 -3.523437 c 0.003906 -0.554688 -0.445313 -1.003907 -0.996094 -1.003907 z m 0 0"/>
<path d="m 16 8 v 2 c -3.535156 0.214844 -4.234375 2.445312 -5.648438 3.085938 c -0.5 0.226562 -1.425781 0.164062 -1.976562 0.164062 h -0.375 c -1.105469 0 -2 -0.894531 -2 -2 v -3.25"/>
</g>
</svg>
<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16"><g color="#000" fill="#2e3436"><path d="M7.188 2.281c-.094.056-.192.125-.29.19L5.566 3.803a1.684 1.684 0 11-2.17 2.17L2.332 7.037c.506-.069 1.017-.136 1.2.026.242.214.139 1.031.155 1.656.213.088.427.171.657.219.04.008.085-.007.125 0 .337-.525.683-1.288 1-1.344.322-.057.905.562 1.406.937a3.67 3.67 0 00.656-.468c-.195-.595-.594-1.369-.437-1.657.158-.29 1.019-.37 1.625-.531.028-.183.062-.371.062-.562 0-.075-.027-.146-.031-.22-.587-.217-1.435-.385-1.562-.687-.128-.302.34-1.021.593-1.593a3.722 3.722 0 00-.593-.532zm3.875 3.25c-.165.475-.305 1.086-.47 1.563-.43.047-.84.14-1.218.312-.38-.322-.787-.773-1.156-1.093a5.562 5.562 0 00-.688.468c.177.46.453 1.001.625 1.469-.298.309-.531.67-.719 1.063-.494 0-1.102-.084-1.593-.094a5.68 5.68 0 00-.219.812c.435.24 1.006.468 1.438.72-.006.093-.032.185-.032.28 0 .333.049.66.125.97-.382.304-.898.63-1.28.937.015.044.04.083.058.127l.613.613c.417-.1.868-.223 1.266-.303.248.343.532.626.875.875-.027.135-.068.283-.104.428.174-.063.34-.155.482-.297l1.432-1.432a1.994 1.994 0 01.533-3.918c.919 0 1.684.623 1.918 1.467l1.338-1.338c.06-.06.11-.124.156-.191-.035-.062-.06-.13-.1-.188.096-.152.205-.31.315-.47.017-.348-.1-.7-.37-.971l-.177-.176c-.28.192-.561.387-.83.555-.345-.233-.746-.383-1.156-.5-.077-.507-.107-1.132-.187-1.625a5.44 5.44 0 00-.875-.063zm-9.247.608c-.087.068-.173.138-.254.205l.014.035z" style="marker:none" overflow="visible"/><path d="M8.707.293a1 1 0 00-1.415 0l-6.999 7a1 1 0 000 1.413l7 7.001a1 1 0 001.415 0l7-7a1 1 0 000-1.413zm-.708 2.121l5.587 5.587L8 13.586 2.414 7.999z" style="line-height:normal;font-variant-ligatures:normal;font-variant-position:normal;font-variant-caps:normal;font-variant-numeric:normal;font-variant-alternates:normal;font-feature-settings:normal;text-indent:0;text-align:start;text-decoration-line:none;text-decoration-style:solid;text-decoration-color:#000;text-transform:none;text-orientation:mixed;shape-padding:0;isolation:auto;mix-blend-mode:normal;marker:none" font-weight="400" font-family="sans-serif" overflow="visible"/></g></svg>

Before: 2.0 KiB

After: 2.0 KiB

View File

@ -33,24 +33,4 @@ test('Validate schema file',
compile_schemas,
args: ['--strict', '--dry-run', meson.current_source_dir()])
#service_conf = configuration_data()
#service_conf.set('appid', application_id)
#service_conf.set('libexecdir', join_paths(get_option('prefix'), get_option('bindir')))
#configure_file(
#input: 'com.jeffser.Alpaca.SearchProvider.service.in',
#output: '@0@.SearchProvider.service'.format(application_id),
#configuration: service_conf,
#install_dir: join_paths(join_paths(get_option('prefix'), get_option('datadir')), 'dbus-1', 'services')
#)
#search_provider_conf = configuration_data()
#search_provider_conf.set('appid', application_id)
#configure_file(
#configuration: search_provider_conf,
#input: files('com.jeffser.Alpaca.SearchProvider.ini.in'),
#install_dir: join_paths(get_option('datadir'), 'gnome-shell', 'search-providers'),
#output: '@0@.SearchProvider.ini'.format(application_id)
#)
subdir('icons')
subdir('icons')

View File

@ -1,12 +1,11 @@
project('Alpaca', 'c',
version: '2.7.0',
project('Alpaca',
version: '0.4.0',
meson_version: '>= 0.62.0',
default_options: [ 'warning_level=2', 'werror=false', ],
)
i18n = import('i18n')
gnome = import('gnome')
application_id = 'com.jeffser.Alpaca'
subdir('data')
subdir('src')

View File

@ -1,13 +1 @@
ru
es
pt_BR
fr
nb_NO
bn
zh_Hans
hi
tr
uk
de
he
te
ru

View File

@ -3,13 +3,4 @@ data/com.jeffser.Alpaca.metainfo.xml.in
data/com.jeffser.Alpaca.gschema.xml
src/main.py
src/window.py
src/available_models_descriptions.py
src/connection_handler.py
src/window.ui
src/generic_actions.py
src/custom_widgets/chat_widget.py
src/custom_widgets/message_widget.py
src/custom_widgets/model_widget.py
src/custom_widgets/table_widget.py
src/custom_widgets/dialog_widget.py
src/custom_widgets/terminal_widget.py

File diff suppressed because it is too large

2306
po/az.po

File diff suppressed because it is too large

3302
po/bn.po

File diff suppressed because it is too large

3276
po/de.po

File diff suppressed because it is too large

3380
po/es.po

File diff suppressed because it is too large

3379
po/fr.po

File diff suppressed because it is too large

3070
po/he.po

File diff suppressed because it is too large

3222
po/hi.po

File diff suppressed because it is too large

2554
po/it.po

File diff suppressed because it is too large

File diff suppressed because it is too large

2527
po/pl.po

File diff suppressed because it is too large

File diff suppressed because it is too large

3179
po/ru.po

File diff suppressed because it is too large

3056
po/te.po

File diff suppressed because it is too large

3217
po/tr.po

File diff suppressed because it is too large

3210
po/uk.po

File diff suppressed because it is too large

File diff suppressed because it is too large

View File

@ -1,93 +0,0 @@
name: jeffser-alpaca
base: core24
adopt-info: alpaca
platforms:
amd64:
arm64:
confinement: strict
grade: stable
compression: lzo
slots:
dbus-alpaca:
interface: dbus
bus: session
name: com.jeffser.Alpaca
apps:
alpaca:
command: usr/bin/alpaca
common-id: com.jeffser.Alpaca
extensions:
- gnome
plugs:
- network
- network-bind
- home
- removable-media
ollama:
command: bin/ollama
plugs:
- home
- removable-media
- network
- network-bind
ollama-daemon:
command: bin/ollama serve
daemon: simple
install-mode: enable
restart-condition: on-failure
plugs:
- home
- removable-media
- network
- network-bind
parts:
# Python dependencies
python-deps:
plugin: python
source: .
python-packages:
- requests==2.31.0
- pillow==10.3.0
- pypdf==4.2.0
- pytube==15.0.0
- html2text==2024.2.26
# Ollama plugin
ollama:
plugin: dump
source:
- on amd64: https://github.com/ollama/ollama/releases/download/v0.3.12/ollama-linux-amd64.tgz
- on arm64: https://github.com/ollama/ollama/releases/download/v0.3.12/ollama-linux-arm64.tgz
# Alpaca app
alpaca:
plugin: meson
source-type: git
source: https://github.com/Jeffser/Alpaca.git
source-tag: 2.6.5
source-depth: 1
meson-parameters:
- --prefix=/snap/alpaca/current/usr
override-build: |
craftctl default
sed -i '1c#!/usr/bin/env python3' $CRAFT_PART_INSTALL/snap/alpaca/current/usr/bin/alpaca
parse-info:
- usr/share/metainfo/com.jeffser.Alpaca.metainfo.xml
organize:
snap/alpaca/current: .
after: [python-deps]
deps:
plugin: nil
after: [alpaca]
stage-packages:
- libnuma1
prime:
- usr/lib/*/libnuma.so.1*

View File

@ -1,39 +1,6 @@
<?xml version="1.0" encoding="UTF-8"?>
<gresources>
<gresource prefix="/com/jeffser/Alpaca">
<file>style.css</file>
<file alias="icons/scalable/status/paper-plane-symbolic.svg">icons/paper-plane-symbolic.svg</file>
<file alias="icons/scalable/status/globe-symbolic.svg">icons/globe-symbolic.svg</file>
<file alias="icons/scalable/status/chat-message-new-symbolic.svg">icons/chat-message-new-symbolic.svg</file>
<file alias="icons/scalable/status/dialog-warning-symbolic.svg">icons/dialog-warning-symbolic.svg</file>
<file alias="icons/scalable/status/document-edit-symbolic.svg">icons/document-edit-symbolic.svg</file>
<file alias="icons/scalable/status/edit-copy-symbolic.svg">icons/edit-copy-symbolic.svg</file>
<file alias="icons/scalable/status/folder-download-symbolic.svg">icons/folder-download-symbolic.svg</file>
<file alias="icons/scalable/status/image-x-generic-symbolic.svg">icons/image-x-generic-symbolic.svg</file>
<file alias="icons/scalable/status/media-playback-stop-symbolic.svg">icons/media-playback-stop-symbolic.svg</file>
<file alias="icons/scalable/status/open-menu-symbolic.svg">icons/open-menu-symbolic.svg</file>
<file alias="icons/scalable/status/settings-symbolic.svg">icons/settings-symbolic.svg</file>
<file alias="icons/scalable/status/sidebar-show-symbolic.svg">icons/sidebar-show-symbolic.svg</file>
<file alias="icons/scalable/status/user-trash-symbolic.svg">icons/user-trash-symbolic.svg</file>
<file alias="icons/scalable/status/view-more-symbolic.svg">icons/view-more-symbolic.svg</file>
<file alias="icons/scalable/status/document-open-symbolic.svg">icons/document-open-symbolic.svg</file>
<file alias="icons/scalable/status/list-add-symbolic.svg">icons/list-add-symbolic.svg</file>
<file alias="icons/scalable/status/brain-augemnted-symbolic.svg">icons/brain-augemnted-symbolic.svg</file>
<file alias="icons/scalable/status/chain-link-loose-symbolic.svg">icons/chain-link-loose-symbolic.svg</file>
<file alias="icons/scalable/status/document-text-symbolic.svg">icons/document-text-symbolic.svg</file>
<file alias="icons/scalable/status/play-symbolic.svg">icons/play-symbolic.svg</file>
<file alias="icons/scalable/status/step-back-symbolic.svg">icons/step-back-symbolic.svg</file>
<file alias="icons/scalable/status/step-over-symbolic.svg">icons/step-over-symbolic.svg</file>
<file alias="icons/scalable/status/share-symbolic.svg">icons/share-symbolic.svg</file>
<file alias="icons/scalable/status/edit-find-symbolic.svg">icons/edit-find-symbolic.svg</file>
<file alias="icons/scalable/status/edit-symbolic.svg">icons/edit-symbolic.svg</file>
<file alias="icons/scalable/status/image-missing-symbolic.svg">icons/image-missing-symbolic.svg</file>
<file alias="icons/scalable/status/update-symbolic.svg">icons/update-symbolic.svg</file>
<file alias="icons/scalable/status/down-symbolic.svg">icons/down-symbolic.svg</file>
<file alias="icons/scalable/status/chat-bubble-text-symbolic.svg">icons/chat-bubble-text-symbolic.svg</file>
<file alias="icons/scalable/status/execute-from-symbolic.svg">icons/execute-from-symbolic.svg</file>
<file alias="icons/scalable/status/cross-large-symbolic.svg">icons/cross-large-symbolic.svg</file>
<file alias="icons/scalable/status/info-outline-symbolic.svg">icons/info-outline-symbolic.svg</file>
<file preprocess="xml-stripblanks">window.ui</file>
<file preprocess="xml-stripblanks">gtk/help-overlay.ui</file>
</gresource>

File diff suppressed because it is too large

92
src/available_models.py Normal file
View File

@ -0,0 +1,92 @@
# available_models.py
# There isn't an API to do this, sorry
available_models = {
"llama3":"Meta Llama 3: The most capable openly available LLM to date",
"phi3":"Phi-3 Mini is a 3.8B parameters, lightweight, state-of-the-art open model by Microsoft.",
"wizardlm2":"State of the art large language model from Microsoft AI with improved performance on complex chat, multilingual, reasoning and agent use cases.",
"mistral":"The 7B model released by Mistral AI, updated to version 0.2.",
"gemma":"Gemma is a family of lightweight, state-of-the-art open models built by Google DeepMind. Updated to version 1.1",
"mixtral":"A set of Mixture of Experts (MoE) model with open weights by Mistral AI in 8x7b and 8x22b parameter sizes.",
"llama2":"Llama 2 is a collection of foundation language models ranging from 7B to 70B parameters.",
"codegemma":"CodeGemma is a collection of powerful, lightweight models that can perform a variety of coding tasks like fill-in-the-middle code completion, code generation, natural language understanding, mathematical reasoning, and instruction following.",
"command-r":"Command R is a Large Language Model optimized for conversational interaction and long context tasks.",
"command-r-plus":"Command R+ is a powerful, scalable large language model purpose-built to excel at real-world enterprise use cases.",
"llava":"🌋 LLaVA is a novel end-to-end trained large multimodal model that combines a vision encoder and Vicuna for general-purpose visual and language understanding. Updated to version 1.6.",
"dbrx":"DBRX is an open, general-purpose LLM created by Databricks.",
"codellama":"A large language model that can use text prompts to generate and discuss code.",
"qwen":"Qwen 1.5 is a series of large language models by Alibaba Cloud spanning from 0.5B to 110B parameters",
"dolphin-mixtral":"Uncensored, 8x7b and 8x22b fine-tuned models based on the Mixtral mixture of experts models that excels at coding tasks. Created by Eric Hartford.",
"llama2-uncensored":"Uncensored Llama 2 model by George Sung and Jarrad Hope.",
"deepseek-coder":"DeepSeek Coder is a capable coding model trained on two trillion code and natural language tokens.",
"mistral-openorca":"Mistral OpenOrca is a 7 billion parameter model, fine-tuned on top of the Mistral 7B model using the OpenOrca dataset.",
"nomic-embed-text":"A high-performing open embedding model with a large token context window.",
"phi":"Phi-2: a 2.7B language model by Microsoft Research that demonstrates outstanding reasoning and language understanding capabilities.",
"dolphin-mistral":"The uncensored Dolphin model based on Mistral that excels at coding tasks. Updated to version 2.8.",
"orca-mini":"A general-purpose model ranging from 3 billion parameters to 70 billion, suitable for entry-level hardware.",
"nous-hermes2":"The powerful family of models by Nous Research that excels at scientific discussion and coding tasks.",
"zephyr":"Zephyr is a series of fine-tuned versions of the Mistral and Mixtral models that are trained to act as helpful assistants.",
"llama2-chinese":"Llama 2 based model fine tuned to improve Chinese dialogue ability.",
"wizard-vicuna-uncensored":"Wizard Vicuna Uncensored is a 7B, 13B, and 30B parameter model based on Llama 2 uncensored by Eric Hartford.",
"vicuna":"General use chat model based on Llama and Llama 2 with 2K to 16K context sizes.",
"starcoder2":"StarCoder2 is the next generation of transparently trained open code LLMs that comes in three sizes: 3B, 7B and 15B parameters.",
"openhermes":"OpenHermes 2.5 is a 7B model fine-tuned by Teknium on Mistral with fully open datasets.",
"tinyllama":"The TinyLlama project is an open endeavor to train a compact 1.1B Llama model on 3 trillion tokens.",
"openchat":"A family of open-source models trained on a wide variety of data, surpassing ChatGPT on various benchmarks. Updated to version 3.5-0106.",
"tinydolphin":"An experimental 1.1B parameter model trained on the new Dolphin 2.8 dataset by Eric Hartford and based on TinyLlama.",
"starcoder":"StarCoder is a code generation model trained on 80+ programming languages.",
"wizardcoder":"State-of-the-art code generation model",
"stable-code":"Stable Code 3B is a coding model with instruct and code completion variants on par with models such as Code Llama 7B that are 2.5x larger.",
"dolphin-llama3":"Dolphin 2.9 is a new model with 8B and 70B sizes by Eric Hartford based on Llama 3 that has a variety of instruction, conversational, and coding skills.",
"yi":"A high-performing, bilingual language model.",
"mxbai-embed-large":"State-of-the-art large embedding model from mixedbread.ai",
"neural-chat":"A fine-tuned model based on Mistral with good coverage of domain and language.",
"phind-codellama":"Code generation model based on Code Llama.",
"wizard-math":"Model focused on math and logic problems",
"starling-lm":"Starling is a large language model trained by reinforcement learning from AI feedback focused on improving chatbot helpfulness.",
"falcon":"A large language model built by the Technology Innovation Institute (TII) for use in summarization, text generation, and chat bots.",
"orca2":"Orca 2 is built by Microsoft research, and are a fine-tuned version of Meta's Llama 2 models. The model is designed to excel particularly in reasoning.",
"dolphincoder":"A 7B and 15B uncensored variant of the Dolphin model family that excels at coding, based on StarCoder2.",
"dolphin-phi":"2.7B uncensored Dolphin model by Eric Hartford, based on the Phi language model by Microsoft Research.",
"nous-hermes":"General use models based on Llama and Llama 2 from Nous Research.",
"sqlcoder":"SQLCoder is a code completion model fined-tuned on StarCoder for SQL generation tasks",
"solar":"A compact, yet powerful 10.7B large language model designed for single-turn conversation.",
"stablelm2":"Stable LM 2 is a state-of-the-art 1.6B and 12B parameter language model trained on multilingual data in English, Spanish, German, Italian, French, Portuguese, and Dutch.",
"bakllava":"BakLLaVA is a multimodal model consisting of the Mistral 7B base model augmented with the LLaVA architecture.",
"medllama2":"Fine-tuned Llama 2 model to answer medical questions based on an open source medical dataset.",
"yarn-llama2":"An extension of Llama 2 that supports a context of up to 128k tokens.",
"deepseek-llm":"An advanced language model crafted with 2 trillion bilingual tokens.",
"nous-hermes2-mixtral":"The Nous Hermes 2 model from Nous Research, now trained over Mixtral.",
"wizardlm-uncensored":"Uncensored version of Wizard LM model",
"codeqwen":"CodeQwen1.5 is a large language model pretrained on a large amount of code data.",
"all-minilm":"Embedding models on very large sentence level datasets.",
"samantha-mistral":"A companion assistant trained in philosophy, psychology, and personal relationships. Based on Mistral.",
"codeup":"Great code generation model based on Llama2.",
"stable-beluga":"Llama 2 based model fine tuned on an Orca-style dataset. Originally called Free Willy.",
"llama3-gradient":"This model extends LLama-3 8B's context length from 8k to over 1m tokens.",
"everythinglm":"Uncensored Llama2 based model with support for a 16K context window.",
"xwinlm":"Conversational model based on Llama 2 that performs competitively on various benchmarks.",
"yarn-mistral":"An extension of Mistral to support context windows of 64K or 128K.",
"meditron":"Open-source medical large language model adapted from Llama 2 to the medical domain.",
"wizardlm":"General use model based on Llama 2.",
"llama-pro":"An expansion of Llama 2 that specializes in integrating both general language understanding and domain-specific knowledge, particularly in programming and mathematics.",
"magicoder":"🎩 Magicoder is a family of 7B parameter models trained on 75K synthetic instruction data using OSS-Instruct, a novel approach to enlightening LLMs with open-source code snippets.",
"stablelm-zephyr":"A lightweight chat model allowing accurate, and responsive output without requiring high-end hardware.",
"codebooga":"A high-performing code instruct model created by merging two existing code models.",
"nexusraven":"Nexus Raven is a 13B instruction tuned model for function calling tasks.",
"mistrallite":"MistralLite is a fine-tuned model based on Mistral with enhanced capabilities of processing long contexts.",
"wizard-vicuna":"Wizard Vicuna is a 13B parameter model based on Llama 2 trained by MelodysDreamj.",
"goliath":"A language model created by combining two fine-tuned Llama 2 70B models into one.",
"open-orca-platypus2":"Merge of the Open Orca OpenChat model and the Garage-bAInd Platypus 2 model. Designed for chat and code generation.",
"notux":"A top-performing mixture of experts model, fine-tuned with high-quality data.",
"megadolphin":"MegaDolphin-2.2-120b is a transformation of Dolphin-2.2-70b created by interleaving the model with itself.",
"snowflake-arctic-embed":"A suite of text embedding models by Snowflake, optimized for performance.",
"duckdb-nsql":"7B parameter text-to-SQL model made by MotherDuck and Numbers Station.",
"moondream":"moondream is a small vision language model designed to run efficiently on edge devices.",
"notus":"A 7B chat model fine-tuned with high-quality data and based on Zephyr.",
"alfred":"A robust conversational model designed to be used for both chat and instruct use cases.",
"llava-llama3":"A LLaVA model fine-tuned from Llama 3 Instruct with better scores in several benchmarks.",
"llama3-chatqa":"A model from NVIDIA based on Llama 3 that excels at conversational question answering (QA) and retrieval-augmented generation (RAG).",
"llava-phi3":"A new small LLaVA model fine-tuned from Phi 3 Mini."
}
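
The entries above map a model's base name to a short human-readable description. A minimal sketch of how such a table might be queried, assuming it has been loaded into a Python dict named descriptions (the helper name and the example tag are illustrative):

def describe(model_tag: str, descriptions: dict) -> str:
    # Tags such as "codellama:7b" carry a size suffix; only the base name is keyed.
    base = model_tag.split(':')[0]
    return descriptions.get(base, "No description available.")

print(describe("codellama:7b", descriptions))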

View File

@ -1,124 +0,0 @@
descriptions = {
'llama3.2': _("Meta's Llama 3.2 goes small with 1B and 3B models."),
'llama3.1': _("Llama 3.1 is a new state-of-the-art model from Meta available in 8B, 70B and 405B parameter sizes."),
'gemma2': _("Google Gemma 2 is a high-performing and efficient model available in three sizes: 2B, 9B, and 27B."),
'qwen2.5': _("Qwen2.5 models are pretrained on Alibaba's latest large-scale dataset, encompassing up to 18 trillion tokens. The model supports up to 128K tokens and has multilingual support."),
'phi3.5': _("A lightweight AI model with 3.8 billion parameters with performance overtaking similarly and larger sized models."),
'nemotron-mini': _("A commercial-friendly small language model by NVIDIA optimized for roleplay, RAG QA, and function calling."),
'mistral-small': _("Mistral Small is a lightweight model designed for cost-effective use in tasks like translation and summarization."),
'mistral-nemo': _("A state-of-the-art 12B model with 128k context length, built by Mistral AI in collaboration with NVIDIA."),
'deepseek-coder-v2': _("An open-source Mixture-of-Experts code language model that achieves performance comparable to GPT4-Turbo in code-specific tasks."),
'mistral': _("The 7B model released by Mistral AI, updated to version 0.3."),
'mixtral': _("A set of Mixture of Experts (MoE) model with open weights by Mistral AI in 8x7b and 8x22b parameter sizes."),
'codegemma': _("CodeGemma is a collection of powerful, lightweight models that can perform a variety of coding tasks like fill-in-the-middle code completion, code generation, natural language understanding, mathematical reasoning, and instruction following."),
'command-r': _("Command R is a Large Language Model optimized for conversational interaction and long context tasks."),
'command-r-plus': _("Command R+ is a powerful, scalable large language model purpose-built to excel at real-world enterprise use cases."),
'llava': _("🌋 LLaVA is a novel end-to-end trained large multimodal model that combines a vision encoder and Vicuna for general-purpose visual and language understanding. Updated to version 1.6."),
'llama3': _("Meta Llama 3: The most capable openly available LLM to date"),
'gemma': _("Gemma is a family of lightweight, state-of-the-art open models built by Google DeepMind. Updated to version 1.1"),
'qwen': _("Qwen 1.5 is a series of large language models by Alibaba Cloud spanning from 0.5B to 110B parameters"),
'qwen2': _("Qwen2 is a new series of large language models from Alibaba group"),
'phi3': _("Phi-3 is a family of lightweight 3B (Mini) and 14B (Medium) state-of-the-art open models by Microsoft."),
'llama2': _("Llama 2 is a collection of foundation language models ranging from 7B to 70B parameters."),
'codellama': _("A large language model that can use text prompts to generate and discuss code."),
'nomic-embed-text': _("A high-performing open embedding model with a large token context window."),
'mxbai-embed-large': _("State-of-the-art large embedding model from mixedbread.ai"),
'dolphin-mixtral': _("Uncensored, 8x7b and 8x22b fine-tuned models based on the Mixtral mixture of experts models that excels at coding tasks. Created by Eric Hartford."),
'phi': _("Phi-2: a 2.7B language model by Microsoft Research that demonstrates outstanding reasoning and language understanding capabilities."),
'deepseek-coder': _("DeepSeek Coder is a capable coding model trained on two trillion code and natural language tokens."),
'starcoder2': _("StarCoder2 is the next generation of transparently trained open code LLMs that comes in three sizes: 3B, 7B and 15B parameters."),
'llama2-uncensored': _("Uncensored Llama 2 model by George Sung and Jarrad Hope."),
'dolphin-mistral': _("The uncensored Dolphin model based on Mistral that excels at coding tasks. Updated to version 2.8."),
'zephyr': _("Zephyr is a series of fine-tuned versions of the Mistral and Mixtral models that are trained to act as helpful assistants."),
'yi': _("Yi 1.5 is a high-performing, bilingual language model."),
'dolphin-llama3': _("Dolphin 2.9 is a new model with 8B and 70B sizes by Eric Hartford based on Llama 3 that has a variety of instruction, conversational, and coding skills."),
'orca-mini': _("A general-purpose model ranging from 3 billion parameters to 70 billion, suitable for entry-level hardware."),
'llava-llama3': _("A LLaVA model fine-tuned from Llama 3 Instruct with better scores in several benchmarks."),
'qwen2.5-coder': _("The latest series of Code-Specific Qwen models, with significant improvements in code generation, code reasoning, and code fixing."),
'mistral-openorca': _("Mistral OpenOrca is a 7 billion parameter model, fine-tuned on top of the Mistral 7B model using the OpenOrca dataset."),
'starcoder': _("StarCoder is a code generation model trained on 80+ programming languages."),
'tinyllama': _("The TinyLlama project is an open endeavor to train a compact 1.1B Llama model on 3 trillion tokens."),
'codestral': _("Codestral is Mistral AIs first-ever code model designed for code generation tasks."),
'vicuna': _("General use chat model based on Llama and Llama 2 with 2K to 16K context sizes."),
'llama2-chinese': _("Llama 2 based model fine tuned to improve Chinese dialogue ability."),
'snowflake-arctic-embed': _("A suite of text embedding models by Snowflake, optimized for performance."),
'wizard-vicuna-uncensored': _("Wizard Vicuna Uncensored is a 7B, 13B, and 30B parameter model based on Llama 2 uncensored by Eric Hartford."),
'granite-code': _("A family of open foundation models by IBM for Code Intelligence"),
'codegeex4': _("A versatile model for AI software development scenarios, including code completion."),
'nous-hermes2': _("The powerful family of models by Nous Research that excels at scientific discussion and coding tasks."),
'all-minilm': _("Embedding models on very large sentence level datasets."),
'openchat': _("A family of open-source models trained on a wide variety of data, surpassing ChatGPT on various benchmarks. Updated to version 3.5-0106."),
'aya': _("Aya 23, released by Cohere, is a new family of state-of-the-art, multilingual models that support 23 languages."),
'codeqwen': _("CodeQwen1.5 is a large language model pretrained on a large amount of code data."),
'wizardlm2': _("State of the art large language model from Microsoft AI with improved performance on complex chat, multilingual, reasoning and agent use cases."),
'tinydolphin': _("An experimental 1.1B parameter model trained on the new Dolphin 2.8 dataset by Eric Hartford and based on TinyLlama."),
'wizardcoder': _("State-of-the-art code generation model"),
'stable-code': _("Stable Code 3B is a coding model with instruct and code completion variants on par with models such as Code Llama 7B that are 2.5x larger."),
'openhermes': _("OpenHermes 2.5 is a 7B model fine-tuned by Teknium on Mistral with fully open datasets."),
'qwen2-math': _("Qwen2 Math is a series of specialized math language models built upon the Qwen2 LLMs, which significantly outperforms the mathematical capabilities of open-source models and even closed-source models (e.g., GPT4o)."),
'bakllava': _("BakLLaVA is a multimodal model consisting of the Mistral 7B base model augmented with the LLaVA architecture."),
'stablelm2': _("Stable LM 2 is a state-of-the-art 1.6B and 12B parameter language model trained on multilingual data in English, Spanish, German, Italian, French, Portuguese, and Dutch."),
'llama3-gradient': _("This model extends LLama-3 8B's context length from 8k to over 1m tokens."),
'deepseek-llm': _("An advanced language model crafted with 2 trillion bilingual tokens."),
'wizard-math': _("Model focused on math and logic problems"),
'glm4': _("A strong multi-lingual general language model with competitive performance to Llama 3."),
'neural-chat': _("A fine-tuned model based on Mistral with good coverage of domain and language."),
'reflection': _("A high-performing model trained with a new technique called Reflection-tuning that teaches a LLM to detect mistakes in its reasoning and correct course."),
'llama3-chatqa': _("A model from NVIDIA based on Llama 3 that excels at conversational question answering (QA) and retrieval-augmented generation (RAG)."),
'mistral-large': _("Mistral Large 2 is Mistral's new flagship model that is significantly more capable in code generation, mathematics, and reasoning with 128k context window and support for dozens of languages."),
'moondream': _("moondream2 is a small vision language model designed to run efficiently on edge devices."),
'xwinlm': _("Conversational model based on Llama 2 that performs competitively on various benchmarks."),
'phind-codellama': _("Code generation model based on Code Llama."),
'nous-hermes': _("General use models based on Llama and Llama 2 from Nous Research."),
'sqlcoder': _("SQLCoder is a code completion model fined-tuned on StarCoder for SQL generation tasks"),
'dolphincoder': _("A 7B and 15B uncensored variant of the Dolphin model family that excels at coding, based on StarCoder2."),
'yarn-llama2': _("An extension of Llama 2 that supports a context of up to 128k tokens."),
'smollm': _("🪐 A family of small models with 135M, 360M, and 1.7B parameters, trained on a new high-quality dataset."),
'wizardlm': _("General use model based on Llama 2."),
'deepseek-v2': _("A strong, economical, and efficient Mixture-of-Experts language model."),
'starling-lm': _("Starling is a large language model trained by reinforcement learning from AI feedback focused on improving chatbot helpfulness."),
'samantha-mistral': _("A companion assistant trained in philosophy, psychology, and personal relationships. Based on Mistral."),
'solar': _("A compact, yet powerful 10.7B large language model designed for single-turn conversation."),
'orca2': _("Orca 2 is built by Microsoft research, and are a fine-tuned version of Meta's Llama 2 models. The model is designed to excel particularly in reasoning."),
'stable-beluga': _("Llama 2 based model fine tuned on an Orca-style dataset. Originally called Free Willy."),
'dolphin-phi': _("2.7B uncensored Dolphin model by Eric Hartford, based on the Phi language model by Microsoft Research."),
'wizardlm-uncensored': _("Uncensored version of Wizard LM model"),
'hermes3': _("Hermes 3 is the latest version of the flagship Hermes series of LLMs by Nous Research"),
'yi-coder': _("Yi-Coder is a series of open-source code language models that delivers state-of-the-art coding performance with fewer than 10 billion parameters."),
'llava-phi3': _("A new small LLaVA model fine-tuned from Phi 3 Mini."),
'internlm2': _("InternLM2.5 is a 7B parameter model tailored for practical scenarios with outstanding reasoning capability."),
'yarn-mistral': _("An extension of Mistral to support context windows of 64K or 128K."),
'llama-pro': _("An expansion of Llama 2 that specializes in integrating both general language understanding and domain-specific knowledge, particularly in programming and mathematics."),
'medllama2': _("Fine-tuned Llama 2 model to answer medical questions based on an open source medical dataset."),
'meditron': _("Open-source medical large language model adapted from Llama 2 to the medical domain."),
'nexusraven': _("Nexus Raven is a 13B instruction tuned model for function calling tasks."),
'nous-hermes2-mixtral': _("The Nous Hermes 2 model from Nous Research, now trained over Mixtral."),
'codeup': _("Great code generation model based on Llama2."),
'llama3-groq-tool-use': _("A series of models from Groq that represent a significant advancement in open-source AI capabilities for tool use/function calling."),
'everythinglm': _("Uncensored Llama2 based model with support for a 16K context window."),
'magicoder': _("🎩 Magicoder is a family of 7B parameter models trained on 75K synthetic instruction data using OSS-Instruct, a novel approach to enlightening LLMs with open-source code snippets."),
'stablelm-zephyr': _("A lightweight chat model allowing accurate, and responsive output without requiring high-end hardware."),
'codebooga': _("A high-performing code instruct model created by merging two existing code models."),
'wizard-vicuna': _("Wizard Vicuna is a 13B parameter model based on Llama 2 trained by MelodysDreamj."),
'mistrallite': _("MistralLite is a fine-tuned model based on Mistral with enhanced capabilities of processing long contexts."),
'falcon2': _("Falcon2 is an 11B parameters causal decoder-only model built by TII and trained over 5T tokens."),
'duckdb-nsql': _("7B parameter text-to-SQL model made by MotherDuck and Numbers Station."),
'minicpm-v': _("A series of multimodal LLMs (MLLMs) designed for vision-language understanding."),
'megadolphin': _("MegaDolphin-2.2-120b is a transformation of Dolphin-2.2-70b created by interleaving the model with itself."),
'notux': _("A top-performing mixture of experts model, fine-tuned with high-quality data."),
'goliath': _("A language model created by combining two fine-tuned Llama 2 70B models into one."),
'open-orca-platypus2': _("Merge of the Open Orca OpenChat model and the Garage-bAInd Platypus 2 model. Designed for chat and code generation."),
'notus': _("A 7B chat model fine-tuned with high-quality data and based on Zephyr."),
'bge-m3': _("BGE-M3 is a new model from BAAI distinguished for its versatility in Multi-Functionality, Multi-Linguality, and Multi-Granularity."),
'mathstral': _("MathΣtral: a 7B model designed for math reasoning and scientific discovery by Mistral AI."),
'dbrx': _("DBRX is an open, general-purpose LLM created by Databricks."),
'solar-pro': _("Solar Pro Preview: an advanced large language model (LLM) with 22 billion parameters designed to fit into a single GPU"),
'nuextract': _("A 3.8B model fine-tuned on a private high-quality synthetic dataset for information extraction, based on Phi-3."),
'alfred': _("A robust conversational model designed to be used for both chat and instruct use cases."),
'firefunction-v2': _("An open weights function calling model based on Llama 3, competitive with GPT-4o function calling capabilities."),
'reader-lm': _("A series of models that convert HTML content to Markdown content, which is useful for content conversion tasks."),
'bge-large': _("Embedding model from BAAI mapping texts to vectors."),
'deepseek-v2.5': _("An upgraded version of DeekSeek-V2 that integrates the general and coding abilities of both DeepSeek-V2-Chat and DeepSeek-Coder-V2-Instruct."),
'bespoke-minicheck': _("A state-of-the-art fact-checking model developed by Bespoke Labs."),
'paraphrase-multilingual': _("Sentence-transformers model that can be used for tasks like clustering or semantic search."),
}

View File

@ -1,156 +1,61 @@
# connection_handler.py
"""
Handles requests to remote and integrated instances of Ollama
"""
import json, os, requests, subprocess, threading, shutil
from .internal import data_dir, cache_dir
from logging import getLogger
from time import sleep
# connectionhandler.py
import json, requests
logger = getLogger(__name__)
window = None
AMD_support_label = "\n<a href='https://github.com/Jeffser/Alpaca/wiki/AMD-Support'>{}</a>".format(_('Alpaca Support'))
def log_output(pipe):
with open(os.path.join(data_dir, 'tmp.log'), 'a') as f:
with pipe:
try:
for line in iter(pipe.readline, ''):
print(line, end='')
f.write(line)
f.flush()
if 'msg="model request too large for system"' in line:
window.show_toast(_("Model request too large for system"), window.main_overlay)
elif 'msg="amdgpu detected, but no compatible rocm library found.' in line:
if bool(os.getenv("FLATPAK_ID")):
window.ollama_information_label.set_label(_("AMD GPU detected but the extension is missing, Ollama will use CPU.") + AMD_support_label)
else:
window.ollama_information_label.set_label(_("AMD GPU detected but ROCm is missing, Ollama will use CPU.") + AMD_support_label)
window.ollama_information_label.set_css_classes(['dim-label', 'error'])
elif 'msg="amdgpu is supported"' in line:
window.ollama_information_label.set_label(_("Using AMD GPU type '{}'").format(line.split('=')[-1]))
window.ollama_information_label.set_css_classes(['dim-label', 'success'])
except Exception as e:
pass
class instance():
def __init__(self, local_port:int, remote_url:str, remote:bool, tweaks:dict, overrides:dict, bearer_token:str, idle_timer_delay:int):
self.local_port=local_port
self.remote_url=remote_url
self.remote=remote
self.tweaks=tweaks
self.overrides=overrides
self.bearer_token=bearer_token
self.idle_timer_delay=idle_timer_delay
self.idle_timer_stop_event=threading.Event()
self.idle_timer=None
self.instance=None
self.busy=0
if not self.remote:
self.start()
def get_headers(self, include_json:bool) -> dict:
headers = {}
if include_json:
headers["Content-Type"] = "application/json"
if self.bearer_token and self.remote:
headers["Authorization"] = "Bearer " + self.bearer_token
return headers if len(headers.keys()) > 0 else None
def request(self, connection_type:str, connection_url:str, data:dict=None, callback:callable=None) -> requests.models.Response:
self.busy += 1
if self.idle_timer and not self.remote:
self.idle_timer_stop_event.set()
self.idle_timer=None
if not self.instance and not self.remote:
self.start()
connection_url = '{}/{}'.format(self.remote_url if self.remote else 'http://127.0.0.1:{}'.format(self.local_port), connection_url)
logger.info('{} : {}'.format(connection_type, connection_url))
response = None
match connection_type:
case "GET":
response = requests.get(connection_url, headers=self.get_headers(False))
case "POST":
if callback:
response = requests.post(connection_url, headers=self.get_headers(True), data=data, stream=True)
if response.status_code == 200:
for line in response.iter_lines():
if line:
callback(json.loads(line.decode("utf-8")))
else:
response = requests.post(connection_url, headers=self.get_headers(True), data=data, stream=False)
case "DELETE":
response = requests.delete(connection_url, headers=self.get_headers(False), data=data)
self.busy -= 1
if not self.idle_timer and not self.remote:
self.start_timer()
return response
def run_timer(self):
if not self.idle_timer_stop_event.wait(self.idle_timer_delay*60):
window.show_toast(_("Ollama instance was shut down due to inactivity"), window.main_overlay)
self.stop()
def start_timer(self):
if self.busy == 0:
if self.idle_timer:
self.idle_timer_stop_event.set()
self.idle_timer=None
if self.idle_timer_delay > 0 and self.busy == 0:
self.idle_timer_stop_event.clear()
self.idle_timer = threading.Thread(target=self.run_timer)
self.idle_timer.start()
def start(self):
self.stop()
if shutil.which('ollama'):
if not os.path.isdir(os.path.join(cache_dir, 'tmp/ollama')):
os.mkdir(os.path.join(cache_dir, 'tmp/ollama'))
self.instance = None
params = self.overrides.copy()
params["OLLAMA_DEBUG"] = "1"
params["OLLAMA_HOST"] = f"127.0.0.1:{self.local_port}" # You can't change this directly sorry :3
params["HOME"] = data_dir
params["TMPDIR"] = os.path.join(cache_dir, 'tmp/ollama')
instance = subprocess.Popen(["ollama", "serve"], env={**os.environ, **params}, stderr=subprocess.PIPE, stdout=subprocess.PIPE, text=True)
threading.Thread(target=log_output, args=(instance.stdout,)).start()
threading.Thread(target=log_output, args=(instance.stderr,)).start()
logger.info("Starting Alpaca's Ollama instance...")
logger.debug(params)
logger.info("Started Alpaca's Ollama instance")
try:
v_str = subprocess.check_output("ollama -v", shell=True).decode('utf-8')
logger.info(v_str.split('\n')[1].strip('Warning: ').strip())
except Exception as e:
logger.error(e)
self.instance = instance
if not self.idle_timer:
self.start_timer()
window.ollama_information_label.set_label(_("Integrated Ollama instance is running"))
window.ollama_information_label.set_css_classes(['dim-label', 'success'])
def simple_get(connection_url:str) -> dict:
try:
response = requests.get(connection_url)
if response.status_code == 200:
return {"status": "ok", "text": response.text, "status_code": response.status_code}
else:
self.remote = True
window.remote_connection_switch.set_sensitive(True)
window.remote_connection_switch.set_active(True)
return {"status": "error", "status_code": response.status_code}
except Exception as e:
return {"status": "error", "status_code": 0}
def stop(self):
if self.idle_timer:
self.idle_timer_stop_event.set()
self.idle_timer=None
if self.instance:
logger.info("Stopping Alpaca's Ollama instance")
self.instance.terminate()
self.instance.wait()
self.instance = None
window.ollama_information_label.set_label(_("Integrated Ollama instance is not running"))
window.ollama_information_label.set_css_classes(['dim-label'])
logger.info("Stopped Alpaca's Ollama instance")
def simple_delete(connection_url:str, data) -> dict:
try:
response = requests.delete(connection_url, json=data)
if response.status_code == 200:
return {"status": "ok", "status_code": response.status_code}
else:
return {"status": "error", "text": "Failed to delete", "status_code": response.status_code}
except Exception as e:
return {"status": "error", "status_code": 0}
def reset(self):
logger.info("Resetting Alpaca's Ollama instance")
self.stop()
sleep(1)
self.start()
def stream_post(connection_url:str, data, callback:callable) -> dict:
try:
headers = {
"Content-Type": "application/json"
}
response = requests.post(connection_url, headers=headers, data=data, stream=True)
if response.status_code == 200:
for line in response.iter_lines():
if line:
callback(json.loads(line.decode("utf-8")))
return {"status": "ok", "status_code": response.status_code}
else:
return {"status": "error", "status_code": response.status_code}
except Exception as e:
return {"status": "error", "status_code": 0}
from time import sleep
def stream_post_fake(connection_url:str, data, callback:callable) -> dict:
data = {
"status": "pulling manifest"
}
callback(data)
for i in range(2):
for a in range(11):
sleep(.1)
data = {
"status": f"downloading digestname {i}",
"digest": f"digestname {i}",
"total": 500,
"completed": a * 50
}
callback(data)
for msg in ["verifying sha256 digest", "writting manifest", "removing any unused layers", "success"]:
sleep(.1)
data = {"status": msg}
callback(data)
return {"status": "ok", "status_code": 200}

View File

@ -1,458 +0,0 @@
#chat_widget.py
"""
Handles the chat widget (testing)
"""
import gi
gi.require_version('Gtk', '4.0')
gi.require_version('GtkSource', '5')
from gi.repository import Gtk, Gio, Adw, Gdk, GLib
import logging, os, datetime, shutil, random, tempfile, tarfile, json
from ..internal import data_dir
from .message_widget import message
logger = logging.getLogger(__name__)
window = None
possible_prompts = [
"What can you do?",
"Give me a pancake recipe",
"Why is the sky blue?",
"Can you tell me a joke?",
"Give me a healthy breakfast recipe",
"How to make a pizza",
"Can you write a poem?",
"Can you write a story?",
"What is GNU-Linux?",
"Which is the best Linux distro?",
"Why is Pluto not a planet?",
"What is a black-hole?",
"Tell me how to stay fit",
"Write a conversation between sun and Earth",
"Why is the grass green?",
"Write an Haïku about AI",
"What is the meaning of life?",
"Explain quantum physics in simple terms",
"Explain the theory of relativity",
"Explain how photosynthesis works",
"Recommend a film about nature",
"What is nostalgia?"
]
class chat(Gtk.ScrolledWindow):
__gtype_name__ = 'AlpacaChat'
def __init__(self, name:str):
self.container = Gtk.Box(
orientation=1,
hexpand=True,
vexpand=True,
spacing=12,
margin_top=12,
margin_bottom=12,
margin_start=12,
margin_end=12
)
self.clamp = Adw.Clamp(
maximum_size=1000,
tightening_threshold=800,
child=self.container
)
super().__init__(
child=self.clamp,
propagate_natural_height=True,
kinetic_scrolling=True,
vexpand=True,
hexpand=True,
css_classes=["undershoot-bottom"],
name=name,
hscrollbar_policy=2
)
self.messages = {}
self.welcome_screen = None
self.regenerate_button = None
self.busy = False
#self.get_vadjustment().connect('notify::page-size', lambda va, *_: va.set_value(va.get_upper() - va.get_page_size()) if va.get_value() == 0 else None)
##TODO Figure out how to do this with the search thing
def stop_message(self):
self.busy = False
window.switch_send_stop_button(True)
def clear_chat(self):
if self.busy:
self.stop_message()
self.messages = {}
self.stop_message()
for widget in list(self.container):
self.container.remove(widget)
self.show_welcome_screen(len(window.model_manager.get_model_list()) > 0)
print('clear chat for some reason')
def add_message(self, message_id:str, model:str=None):
msg = message(message_id, model)
self.messages[message_id] = msg
self.container.append(msg)
def send_sample_prompt(self, prompt):
buffer = window.message_text_view.get_buffer()
buffer.delete(buffer.get_start_iter(), buffer.get_end_iter())
buffer.insert(buffer.get_start_iter(), prompt, len(prompt.encode('utf-8')))
window.send_message()
def show_welcome_screen(self, show_prompts:bool):
if self.welcome_screen:
self.container.remove(self.welcome_screen)
self.welcome_screen = None
if len(list(self.container)) > 0:
self.clear_chat()
return
button_container = Gtk.Box(
orientation=1,
spacing=10,
halign=3
)
if show_prompts:
for prompt in random.sample(possible_prompts, 3):
prompt_button = Gtk.Button(
label=prompt,
tooltip_text=_("Send prompt: '{}'").format(prompt)
)
prompt_button.connect('clicked', lambda *_, prompt=prompt : self.send_sample_prompt(prompt))
button_container.append(prompt_button)
else:
button = Gtk.Button(
label=_("Open Model Manager"),
tooltip_text=_("Open Model Manager"),
css_classes=["suggested-action", "pill"]
)
button.set_action_name('app.manage_models')
button_container.append(button)
self.welcome_screen = Adw.StatusPage(
icon_name="com.jeffser.Alpaca",
title="Alpaca",
description=_("Try one of these prompts") if show_prompts else _("It looks like you don't have any models downloaded yet. Download models to get started!"),
child=button_container,
vexpand=True
)
self.container.append(self.welcome_screen)
def load_chat_messages(self, messages:dict):
if len(messages.keys()) > 0:
if self.welcome_screen:
self.container.remove(self.welcome_screen)
self.welcome_screen = None
for message_id, message_data in messages.items():
if message_data['content']:
self.add_message(message_id, message_data['model'] if message_data['role'] == 'assistant' else None)
message_element = self.messages[message_id]
if 'images' in message_data:
images=[]
for image in message_data['images']:
images.append(os.path.join(data_dir, "chats", self.get_name(), message_id, image))
message_element.add_images(images)
if 'files' in message_data:
files={}
for file_name, file_type in message_data['files'].items():
files[os.path.join(data_dir, "chats", self.get_name(), message_id, file_name)] = file_type
message_element.add_attachments(files)
GLib.idle_add(message_element.set_text, message_data['content'])
GLib.idle_add(message_element.add_footer, datetime.datetime.strptime(message_data['date'] + (":00" if message_data['date'].count(":") == 1 else ""), '%Y/%m/%d %H:%M:%S'))
else:
self.show_welcome_screen(len(window.model_manager.get_model_list()) > 0)
def messages_to_dict(self) -> dict:
messages_dict = {}
for message_id, message_element in self.messages.items():
if message_element.text and message_element.dt:
messages_dict[message_id] = {
'role': 'assistant' if message_element.bot else 'user',
'model': message_element.model,
'date': message_element.dt.strftime("%Y/%m/%d %H:%M:%S"),
'content': message_element.text
}
if message_element.image_c:
images = []
for file in message_element.image_c.files:
images.append(file.image_name)
messages_dict[message_id]['images'] = images
if message_element.attachment_c:
files = {}
for file in message_element.attachment_c.files:
files[file.file_name] = file.file_type
messages_dict[message_id]['files'] = files
return messages_dict
def show_regenerate_button(self, msg:message):
if self.regenerate_button:
self.remove(self.regenerate_button)
self.regenerate_button = Gtk.Button(
child=Adw.ButtonContent(
icon_name='update-symbolic',
label=_('Regenerate Response')
),
css_classes=["suggested-action"],
halign=3
)
self.regenerate_button.connect('clicked', lambda *_: msg.action_buttons.regenerate_message())
self.container.append(self.regenerate_button)
class chat_tab(Gtk.ListBoxRow):
__gtype_name__ = 'AlpacaChatTab'
def __init__(self, chat_window:chat):
self.chat_window=chat_window
self.spinner = Gtk.Spinner(
spinning=True,
visible=False
)
self.label = Gtk.Label(
label=self.chat_window.get_name(),
tooltip_text=self.chat_window.get_name(),
hexpand=True,
halign=0,
wrap=True,
ellipsize=3,
wrap_mode=2,
xalign=0
)
self.indicator = Gtk.Image.new_from_icon_name("chat-bubble-text-symbolic")
self.indicator.set_visible(False)
self.indicator.set_css_classes(['accent'])
container = Gtk.Box(
orientation=0,
spacing=5
)
container.append(self.label)
container.append(self.spinner)
container.append(self.indicator)
super().__init__(
css_classes = ["chat_row"],
height_request = 45,
child = container
)
self.gesture = Gtk.GestureClick(button=3)
self.gesture.connect("released", self.chat_click_handler)
self.add_controller(self.gesture)
def chat_click_handler(self, gesture, n_press, x, y):
chat_row = gesture.get_widget()
popover = Gtk.PopoverMenu(
menu_model=window.chat_right_click_menu,
has_arrow=False,
halign=1,
height_request=155
)
window.selected_chat_row = chat_row
position = Gdk.Rectangle()
position.x = x
position.y = y
popover.set_parent(chat_row.get_child())
popover.set_pointing_to(position)
popover.popup()
class chat_list(Gtk.ListBox):
__gtype_name__ = 'AlpacaChatList'
def __init__(self):
super().__init__(
selection_mode=1,
css_classes=["navigation-sidebar"]
)
self.connect("row-selected", lambda listbox, row: self.chat_changed(row))
self.tab_list = []
def update_welcome_screens(self, show_prompts:bool):
for tab in self.tab_list:
if tab.chat_window.welcome_screen:
tab.chat_window.show_welcome_screen(show_prompts)
def get_tab_by_name(self, chat_name:str) -> chat_tab:
for tab in self.tab_list:
if tab.chat_window.get_name() == chat_name:
return tab
def get_chat_by_name(self, chat_name:str) -> chat:
tab = self.get_tab_by_name(chat_name)
if tab:
return tab.chat_window
def get_current_chat(self) -> chat:
row = self.get_selected_row()
if row:
return self.get_selected_row().chat_window
def send_tab_to_top(self, tab:chat_tab):
self.unselect_all()
self.tab_list.remove(tab)
self.tab_list.insert(0, tab)
self.remove(tab)
self.prepend(tab)
self.select_row(tab)
def append_chat(self, chat_name:str) -> chat:
chat_name = window.generate_numbered_name(chat_name, [tab.chat_window.get_name() for tab in self.tab_list])
chat_window = chat(chat_name)
tab = chat_tab(chat_window)
self.append(tab)
self.tab_list.append(tab)
window.chat_stack.add_child(chat_window)
return chat_window
def prepend_chat(self, chat_name:str) -> chat:
chat_name = window.generate_numbered_name(chat_name, [tab.chat_window.get_name() for tab in self.tab_list])
chat_window = chat(chat_name)
tab = chat_tab(chat_window)
self.prepend(tab)
self.tab_list.insert(0, tab)
chat_window.show_welcome_screen(len(window.model_manager.get_model_list()) > 0)
window.chat_stack.add_child(chat_window)
window.chat_list_box.select_row(tab)
return chat_window
def new_chat(self):
window.save_history(self.prepend_chat(_("New Chat")))
def delete_chat(self, chat_name:str):
chat_tab = None
for c in self.tab_list:
if c.chat_window.get_name() == chat_name:
chat_tab = c
if chat_tab:
chat_tab.chat_window.stop_message()
window.chat_stack.remove(chat_tab.chat_window)
self.tab_list.remove(chat_tab)
self.remove(chat_tab)
if os.path.exists(os.path.join(data_dir, "chats", chat_name)):
shutil.rmtree(os.path.join(data_dir, "chats", chat_name))
if len(self.tab_list) == 0:
self.new_chat()
if not self.get_current_chat() or self.get_current_chat() == chat_tab.chat_window:
self.select_row(self.get_row_at_index(0))
window.save_history()
def rename_chat(self, old_chat_name:str, new_chat_name:str):
if new_chat_name == old_chat_name:
return
tab = self.get_tab_by_name(old_chat_name)
if tab:
new_chat_name = window.generate_numbered_name(new_chat_name, [tab.chat_window.get_name() for tab in self.tab_list])
tab.label.set_label(new_chat_name)
tab.label.set_tooltip_text(new_chat_name)
tab.chat_window.set_name(new_chat_name)
if os.path.exists(os.path.join(data_dir, "chats", old_chat_name)):
shutil.move(os.path.join(data_dir, "chats", old_chat_name), os.path.join(data_dir, "chats", new_chat_name))
window.save_history(tab.chat_window)
def duplicate_chat(self, chat_name:str):
new_chat_name = window.generate_numbered_name(_("Copy of {}").format(chat_name), [tab.chat_window.get_name() for tab in self.tab_list])
try:
shutil.copytree(os.path.join(data_dir, "chats", chat_name), os.path.join(data_dir, "chats", new_chat_name))
except Exception as e:
logger.error(e)
self.prepend_chat(new_chat_name)
created_chat = self.get_tab_by_name(new_chat_name).chat_window
created_chat.load_chat_messages(self.get_tab_by_name(chat_name).chat_window.messages_to_dict())
window.save_history(created_chat)
def on_replace_contents(self, file, result):
file.replace_contents_finish(result)
window.show_toast(_("Chat exported successfully"), window.main_overlay)
def on_export_chat(self, file_dialog, result, chat_name):
file = file_dialog.save_finish(result)
if not file:
return
json_data = json.dumps({chat_name: {"messages": self.get_chat_by_name(chat_name).messages_to_dict()}}, indent=4).encode("UTF-8")
with tempfile.TemporaryDirectory() as temp_dir:
json_path = os.path.join(temp_dir, "data.json")
with open(json_path, "wb") as json_file:
json_file.write(json_data)
tar_path = os.path.join(temp_dir, chat_name)
with tarfile.open(tar_path, "w") as tar:
tar.add(json_path, arcname="data.json")
directory = os.path.join(data_dir, "chats", chat_name)
if os.path.exists(directory) and os.path.isdir(directory):
tar.add(directory, arcname=os.path.basename(directory))
with open(tar_path, "rb") as tar:
tar_content = tar.read()
file.replace_contents_async(
tar_content,
etag=None,
make_backup=False,
flags=Gio.FileCreateFlags.NONE,
cancellable=None,
callback=self.on_replace_contents
)
def export_chat(self, chat_name:str):
logger.info("Exporting chat")
file_dialog = Gtk.FileDialog(initial_name=f"{chat_name}.tar")
file_dialog.save(parent=window, cancellable=None, callback=lambda file_dialog, result, chat_name=chat_name: self.on_export_chat(file_dialog, result, chat_name))
def on_chat_imported(self, file_dialog, result):
file = file_dialog.open_finish(result)
if not file:
return
stream = file.read(None)
data_stream = Gio.DataInputStream.new(stream)
tar_content = data_stream.read_bytes(1024 * 1024, None)
with tempfile.TemporaryDirectory() as temp_dir:
tar_filename = os.path.join(temp_dir, "imported_chat.tar")
with open(tar_filename, "wb") as tar_file:
tar_file.write(tar_content.get_data())
with tarfile.open(tar_filename, "r") as tar:
tar.extractall(path=temp_dir)
chat_name = None
chat_content = None
for member in tar.getmembers():
if member.name == "data.json":
json_filepath = os.path.join(temp_dir, member.name)
with open(json_filepath, "r", encoding="utf-8") as json_file:
data = json.load(json_file)
for chat_name, chat_content in data.items():
new_chat_name = window.generate_numbered_name(chat_name, [tab.chat_window.get_name() for tab in self.tab_list])
src_path = os.path.join(temp_dir, chat_name)
dest_path = os.path.join(data_dir, "chats", new_chat_name)
if os.path.exists(src_path) and os.path.isdir(src_path) and not os.path.exists(dest_path):
shutil.copytree(src_path, dest_path)
created_chat = self.prepend_chat(new_chat_name)
created_chat.load_chat_messages(chat_content['messages'])
window.save_history(created_chat)
window.show_toast(_("Chat imported successfully"), window.main_overlay)
def import_chat(self):
logger.info("Importing chat")
file_dialog = Gtk.FileDialog(default_filter=window.file_filter_tar)
file_dialog.open(window, None, self.on_chat_imported)
def chat_changed(self, row):
if row:
current_tab_i = next((i for i, t in enumerate(self.tab_list) if t.chat_window == window.chat_stack.get_visible_child()), -1)
if self.tab_list.index(row) != current_tab_i:
if window.searchentry_messages.get_text() != '':
window.searchentry_messages.set_text('')
window.message_search_changed(window.searchentry_messages, window.chat_stack.get_visible_child())
window.message_searchbar.set_search_mode(False)
window.chat_stack.set_transition_type(4 if self.tab_list.index(row) > current_tab_i else 5)
window.chat_stack.set_visible_child(row.chat_window)
window.switch_send_stop_button(not row.chat_window.busy)
if len(row.chat_window.messages) > 0:
last_model_used = row.chat_window.messages[list(row.chat_window.messages)[-1]].model
window.model_manager.change_model(last_model_used)
if row.indicator.get_visible():
row.indicator.set_visible(False)
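
The export path above packs a chat into a tar archive containing a data.json index plus the chat's attachment folder. A minimal sketch of inspecting such an archive outside the application with only the standard library; the archive file name is illustrative:

import json, tarfile

with tarfile.open("New Chat.tar", "r") as tar:
    with tar.extractfile("data.json") as f:
        data = json.load(f)

for chat_name, content in data.items():
    for message_id, message in content["messages"].items():
        # Each entry mirrors messages_to_dict(): role, model, date and the message text.
        print(chat_name, message_id, message["role"], message["content"][:40])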

View File

@ -1,173 +0,0 @@
#dialog_widget.py
"""
Handles all dialogs
"""
import gi
gi.require_version('Gtk', '4.0')
gi.require_version('GtkSource', '5')
from gi.repository import Gtk, Gio, Adw, Gdk, GLib
window=None
button_appearance={
'suggested': Adw.ResponseAppearance.SUGGESTED,
'destructive': Adw.ResponseAppearance.DESTRUCTIVE
}
# Don't call this directly outside this script
class baseDialog(Adw.AlertDialog):
__gtype_name__ = 'AlpacaDialogBase'
def __init__(self, heading:str, body:str, close_response:str, options:dict):
self.options = options
super().__init__(
heading=heading,
body=body,
close_response=close_response
)
for option, data in self.options.items():
self.add_response(option, option)
if 'appearance' in data:
self.set_response_appearance(option, button_appearance[data['appearance']])
if 'default' in data and data['default']:
self.set_default_response(option)
class Options(baseDialog):
__gtype_name__ = 'AlpacaDialogOptions'
def __init__(self, heading:str, body:str, close_response:str, options:dict):
super().__init__(
heading,
body,
close_response,
options
)
self.choose(
parent = window,
cancellable = None,
callback = self.response
)
def response(self, dialog, task):
result = dialog.choose_finish(task)
if result in self.options and 'callback' in self.options[result]:
self.options[result]['callback']()
class Entry(baseDialog):
__gtype_name__ = 'AlpacaDialogEntry'
def __init__(self, heading:str, body:str, close_response:str, options:dict, entries:list or dict):
super().__init__(
heading,
body,
close_response,
options
)
self.container = Gtk.Box(
orientation=1,
spacing=10
)
if isinstance(entries, dict):
entries = [entries]
for data in entries:
entry = Gtk.Entry()
if 'placeholder' in data and data['placeholder']:
entry.set_placeholder_text(data['placeholder'])
if 'css' in data and data['css']:
entry.set_css_classes(data['css'])
if 'text' in data and data['text']:
entry.set_text(data['text'])
self.container.append(entry)
self.set_extra_child(self.container)
self.connect('realize', lambda *_: list(self.container)[0].grab_focus())
self.choose(
parent = window,
cancellable = None,
callback = self.response
)
def response(self, dialog, task):
result = dialog.choose_finish(task)
if result in self.options and 'callback' in self.options[result]:
entry_results = []
for entry in list(self.container):
entry_results.append(entry.get_text())
self.options[result]['callback'](*entry_results)
class DropDown(baseDialog):
__gtype_name__ = 'AlpacaDialogDropDown'
def __init__(self, heading:str, body:str, close_response:str, options:dict, items:list):
super().__init__(
heading,
body,
close_response,
options
)
string_list = Gtk.StringList()
for item in items:
string_list.append(item)
self.set_extra_child(Gtk.DropDown(
enable_search=len(items) > 10,
model=string_list
))
self.connect('realize', lambda *_: self.get_extra_child().grab_focus())
self.choose(
parent = window,
cancellable = None,
callback = lambda dialog, task, dropdown=self.get_extra_child(): self.response(dialog, task, dropdown.get_selected_item().get_string())
)
def response(self, dialog, task, item:str):
result = dialog.choose_finish(task)
if result in self.options and 'callback' in self.options[result]:
self.options[result]['callback'](item)
def simple(heading:str, body:str, callback:callable, button_name:str=_('Accept'), button_appearance:str='suggested'):
options = {
_('Cancel'): {},
button_name: {
'appearance': button_appearance,
'callback': callback,
'default': True
}
}
return Options(heading, body, 'cancel', options)
def simple_entry(heading:str, body:str, callback:callable, entries:list or dict, button_name:str=_('Accept'), button_appearance:str='suggested'):
options = {
_('Cancel'): {},
button_name: {
'appearance': button_appearance,
'callback': callback,
'default': True
}
}
return Entry(heading, body, 'cancel', options, entries)
def simple_dropdown(heading:str, body:str, callback:callable, items:list, button_name:str=_('Accept'), button_appearance:str='suggested'):
options = {
_('Cancel'): {},
button_name: {
'appearance': button_appearance,
'callback': callback,
'default': True
}
}
return DropDown(heading, body, 'cancel', options, items)
def simple_file(file_filter:Gtk.FileFilter, callback:callable):
file_dialog = Gtk.FileDialog(default_filter=file_filter)
file_dialog.open(window, None, lambda file_dialog, result: callback(file_dialog.open_finish(result)) if result else None)
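
The simple, simple_entry and simple_dropdown helpers above reduce an Adw.AlertDialog to a single call with a confirmation callback. A minimal sketch of how another widget module might use them, assuming dialog_widget.window has already been set to the main application window and _ is the project's usual gettext alias; the strings and the delete callback are illustrative:

from . import dialog_widget

def confirm_delete(chat_name: str, on_confirm):
    dialog_widget.simple(
        _('Delete Chat?'),
        _("Are you sure you want to delete '{}'?").format(chat_name),
        on_confirm,          # runs only when the destructive button is chosen
        _('Delete'),
        'destructive'
    )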

View File

@ -1,634 +0,0 @@
#message_widget.py
"""
Handles the message widget (testing)
"""
import gi
gi.require_version('Gtk', '4.0')
gi.require_version('GtkSource', '5')
from gi.repository import Gtk, GObject, Gio, Adw, GtkSource, GLib, Gdk
import logging, os, datetime, re, shutil, threading, sys
from ..internal import config_dir, data_dir, cache_dir, source_dir
from .table_widget import TableWidget
from . import dialog_widget, terminal_widget
logger = logging.getLogger(__name__)
window = None
class edit_text_block(Gtk.Box):
__gtype_name__ = 'AlpacaEditTextBlock'
def __init__(self, text:str):
super().__init__(
hexpand=True,
halign=0,
margin_top=5,
margin_bottom=5,
margin_start=5,
margin_end=5,
spacing=5,
orientation=1
)
self.text_view = Gtk.TextView(
halign=0,
hexpand=True,
css_classes=["view", "editing_message_textview"],
wrap_mode=3
)
cancel_button = Gtk.Button(
vexpand=False,
valign=2,
halign=2,
tooltip_text=_("Cancel"),
css_classes=['flat', 'circular'],
icon_name='cross-large-symbolic'
)
cancel_button.connect('clicked', lambda *_: self.cancel_edit())
save_button = Gtk.Button(
vexpand=False,
valign=2,
halign=2,
tooltip_text=_("Save Message"),
css_classes=['flat', 'circular'],
icon_name='paper-plane-symbolic'
)
save_button.connect('clicked', lambda *_: self.edit_message())
self.append(self.text_view)
button_container = Gtk.Box(
halign=2,
spacing=5
)
button_container.append(cancel_button)
button_container.append(save_button)
self.append(button_container)
self.text_view.get_buffer().insert(self.text_view.get_buffer().get_start_iter(), text, len(text.encode('utf-8')))
key_controller = Gtk.EventControllerKey.new()
key_controller.connect("key-pressed", self.handle_key)
self.text_view.add_controller(key_controller)
def handle_key(self, controller, keyval, keycode, state):
if keyval==Gdk.KEY_Return and not (state & Gdk.ModifierType.SHIFT_MASK):
self.save_edit()
return True
elif keyval==Gdk.KEY_Escape:
self.cancel_edit()
return True
def save_edit(self):
message_element = self.get_parent().get_parent()
message_element.action_buttons.set_visible(True)
message_element.set_text(self.text_view.get_buffer().get_text(self.text_view.get_buffer().get_start_iter(), self.text_view.get_buffer().get_end_iter(), False))
message_element.add_footer(message_element.dt)
window.save_history(message_element.get_parent().get_parent().get_parent().get_parent())
self.get_parent().remove(self)
window.show_toast(_("Message edited successfully"), window.main_overlay)
def cancel_edit(self):
message_element = self.get_parent().get_parent()
message_element.action_buttons.set_visible(True)
message_element.set_text(message_element.text)
message_element.add_footer(message_element.dt)
self.get_parent().remove(self)
class text_block(Gtk.Label):
__gtype_name__ = 'AlpacaTextBlock'
def __init__(self, bot:bool):
super().__init__(
hexpand=True,
halign=0,
wrap=True,
wrap_mode=0,
xalign=0,
margin_top=5,
margin_start=5,
margin_end=5,
focusable=True,
selectable=True
)
self.update_property([4, 7], [_("Response message") if bot else _("User message"), False])
self.connect('notify::has-focus', lambda *_: GLib.idle_add(self.remove_selection) if self.has_focus() else None)
def remove_selection(self):
self.set_selectable(False)
self.set_selectable(True)
def insert_at_end(self, text:str, markdown:bool):
if markdown:
self.set_markup(self.get_text() + text)
else:
self.set_text(self.get_text() + text)
self.update_property([1], [self.get_text()])
def clear_text(self):
self.set_text("")
self.update_property([1], [""])
class code_block(Gtk.Box):
__gtype_name__ = 'AlpacaCodeBlock'
def __init__(self, text:str, language_name:str=None):
super().__init__(
css_classes=["card", "code_block"],
orientation=1,
overflow=1,
margin_start=5,
margin_end=5
)
self.language = None
if language_name:
self.language = GtkSource.LanguageManager.get_default().get_language(language_name)
if self.language:
self.buffer = GtkSource.Buffer.new_with_language(self.language)
else:
self.buffer = GtkSource.Buffer()
self.buffer.set_style_scheme(GtkSource.StyleSchemeManager.get_default().get_scheme('Adwaita-dark'))
self.source_view = GtkSource.View(
auto_indent=True, indent_width=4, buffer=self.buffer, show_line_numbers=True, editable=None,
top_margin=6, bottom_margin=6, left_margin=12, right_margin=12, css_classes=["code_block"]
)
self.source_view.update_property([4], [_("{}Code Block").format('{} '.format(self.language.get_name()) if self.language else "")])
title_box = Gtk.Box(margin_start=12, margin_top=3, margin_bottom=3, margin_end=3)
title_box.append(Gtk.Label(label=self.language.get_name() if self.language else (language_name.title() if language_name else _("Code Block")), hexpand=True, xalign=0))
copy_button = Gtk.Button(icon_name="edit-copy-symbolic", css_classes=["flat", "circular"], tooltip_text=_("Copy Message"))
copy_button.connect("clicked", lambda *_: self.on_copy())
title_box.append(copy_button)
if language_name and language_name.lower() in ['bash', 'python3']:
run_button = Gtk.Button(icon_name="execute-from-symbolic", css_classes=["flat", "circular"], tooltip_text=_("Run Script"))
run_button.connect("clicked", lambda *_: self.run_script(language_name))
title_box.append(run_button)
self.append(title_box)
self.append(Gtk.Separator())
self.append(self.source_view)
self.buffer.set_text(text)
def on_copy(self):
logger.debug("Copying code")
clipboard = Gdk.Display().get_default().get_clipboard()
start = self.buffer.get_start_iter()
end = self.buffer.get_end_iter()
text = self.buffer.get_text(start, end, False)
clipboard.set(text)
window.show_toast(_("Code copied to the clipboard"), window.main_overlay)
def run_script(self, language_name):
logger.debug("Running script")
start = self.buffer.get_start_iter()
end = self.buffer.get_end_iter()
dialog_widget.simple(
_('Run Script'),
_('Make sure you understand what this script does before running it; Alpaca is not responsible for any damages to your device or data'),
lambda script=self.buffer.get_text(start, end, False), language_name=language_name: terminal_widget.run_terminal(script, language_name),
_('Execute'),
'destructive'
)
class attachment(Gtk.Button):
__gtype_name__ = 'AlpacaAttachment'
def __init__(self, file_name:str, file_path:str, file_type:str):
self.file_name = file_name
self.file_path = file_path
self.file_type = file_type
directory, file_name = os.path.split(self.file_path)
head, last_dir = os.path.split(directory)
head, second_last_dir = os.path.split(head)
self.file_path = os.path.join(head, '{selected_chat}', last_dir, file_name)
button_content = Adw.ButtonContent(
label=self.file_name,
icon_name={
"plain_text": "document-text-symbolic",
"pdf": "document-text-symbolic",
"youtube": "play-symbolic",
"website": "globe-symbolic"
}[self.file_type]
)
super().__init__(
vexpand=False,
valign=3,
name=self.file_name,
css_classes=["flat"],
tooltip_text=self.file_name,
child=button_content
)
self.connect("clicked", lambda button, file_path=self.file_path, file_type=self.file_type: window.preview_file(file_path, file_type, None))
class attachment_container(Gtk.ScrolledWindow):
__gtype_name__ = 'AlpacaAttachmentContainer'
def __init__(self):
self.files = []
self.container = Gtk.Box(
orientation=0,
spacing=10,
valign=1
)
super().__init__(
margin_top=10,
margin_start=10,
margin_end=10,
hexpand=True,
child=self.container,
vscrollbar_policy=2
)
def add_file(self, file:attachment):
self.container.append(file)
self.files.append(file)
class image(Gtk.Button):
__gtype_name__ = 'AlpacaImage'
def __init__(self, image_path:str):
self.image_path = image_path
self.image_name = os.path.basename(self.image_path)
directory, file_name = os.path.split(self.image_path)
head, last_dir = os.path.split(directory)
head, second_last_dir = os.path.split(head)
try:
if not os.path.isfile(self.image_path):
raise FileNotFoundError("'{}' was not found or is a directory".format(self.image_path))
image = Gtk.Image.new_from_file(self.image_path)
image.set_size_request(240, 240)
super().__init__(
child=image,
css_classes=["flat", "chat_image_button"],
name=self.image_name,
tooltip_text=_("Image")
)
image.update_property([4], [_("Image")])
except Exception as e:
logger.error(e)
image_texture = Gtk.Image.new_from_icon_name("image-missing-symbolic")
image_texture.set_icon_size(2)
image_texture.set_vexpand(True)
image_texture.set_pixel_size(120)
image_label = Gtk.Label(
label=_("Missing Image"),
)
image_box = Gtk.Box(
spacing=10,
orientation=1,
margin_top=10,
margin_bottom=10,
margin_start=10,
margin_end=10
)
image_box.append(image_texture)
image_box.append(image_label)
image_box.set_size_request(220, 220)
super().__init__(
child=image_box,
css_classes=["flat", "chat_image_button"],
tooltip_text=_("Missing Image")
)
image_texture.update_property([4], [_("Missing image")])
self.set_overflow(1)
self.connect("clicked", lambda button, file_path=os.path.join(head, '{selected_chat}', last_dir, file_name): window.preview_file(file_path, 'image', None))
class image_container(Gtk.ScrolledWindow):
__gtype_name__ = 'AlpacaImageContainer'
def __init__(self):
self.files = []
self.container = Gtk.Box(
orientation=0,
spacing=12
)
super().__init__(
margin_top=10,
margin_start=10,
margin_end=10,
hexpand=True,
height_request = 240,
child=self.container
)
def add_image(self, img:image):
self.container.append(img)
self.files.append(img)
class footer(Gtk.Label):
__gtype_name__ = 'AlpacaMessageFooter'
def __init__(self, dt:datetime.datetime, model:str=None):
super().__init__(
hexpand=False,
halign=0,
wrap=True,
ellipsize=3,
wrap_mode=2,
xalign=0,
margin_bottom=5,
margin_start=5,
focusable=True
)
self.set_markup("<small>{}{}</small>".format((window.convert_model_name(model, 0) + "") if model else "", GLib.markup_escape_text(self.format_datetime(dt))))
def format_datetime(self, dt:datetime) -> str:
date = GLib.DateTime.new(GLib.DateTime.new_now_local().get_timezone(), dt.year, dt.month, dt.day, dt.hour, dt.minute, dt.second)
current_date = GLib.DateTime.new_now_local()
if date.format("%Y/%m/%d") == current_date.format("%Y/%m/%d"):
return date.format("%H:%M %p")
if date.format("%Y") == current_date.format("%Y"):
return date.format("%b %d, %H:%M %p")
return date.format("%b %d %Y, %H:%M %p")
class action_buttons(Gtk.Box):
__gtype_name__ = 'AlpacaActionButtonContainer'
def __init__(self, bot:bool):
super().__init__(
orientation=0,
spacing=6,
margin_end=6,
margin_bottom=6,
valign="end",
halign="end"
)
self.delete_button = Gtk.Button(
icon_name = "user-trash-symbolic",
css_classes = ["flat", "circular"],
tooltip_text = _("Remove Message")
)
self.delete_button.connect('clicked', lambda *_: self.delete_message())
self.append(self.delete_button)
self.copy_button = Gtk.Button(
icon_name = "edit-copy-symbolic",
css_classes = ["flat", "circular"],
tooltip_text = _("Copy Message")
)
self.copy_button.connect('clicked', lambda *_: self.copy_message())
self.append(self.copy_button)
self.regenerate_button = Gtk.Button(
icon_name = "update-symbolic",
css_classes = ["flat", "circular"],
tooltip_text = _("Regenerate Message")
)
self.regenerate_button.connect('clicked', lambda *_: self.regenerate_message())
self.edit_button = Gtk.Button(
icon_name = "edit-symbolic",
css_classes = ["flat", "circular"],
tooltip_text = _("Edit Message")
)
self.edit_button.connect('clicked', lambda *_: self.edit_message())
self.append(self.regenerate_button if bot else self.edit_button)
def delete_message(self):
logger.debug("Deleting message")
chat = self.get_parent().get_parent().get_parent().get_parent().get_parent()
message_id = self.get_parent().message_id
self.get_parent().get_parent().remove(self.get_parent())
if os.path.exists(os.path.join(data_dir, "chats", window.chat_list_box.get_current_chat().get_name(), self.get_parent().message_id)):
shutil.rmtree(os.path.join(data_dir, "chats", window.chat_list_box.get_current_chat().get_name(), self.get_parent().message_id))
del chat.messages[message_id]
window.save_history(chat)
if len(chat.messages) == 0:
chat.show_welcome_screen(len(window.model_manager.get_model_list()) > 0)
def copy_message(self):
logger.debug("Copying message")
clipboard = Gdk.Display.get_default().get_clipboard()
clipboard.set(self.get_parent().text)
window.show_toast(_("Message copied to the clipboard"), window.main_overlay)
def regenerate_message(self):
chat = self.get_parent().get_parent().get_parent().get_parent().get_parent()
message_element = self.get_parent()
if message_element.spinner:
message_element.container.remove(message_element.spinner)
message_element.spinner = None
if not chat.busy:
message_element.set_text()
if message_element.footer:
message_element.container.remove(message_element.footer)
message_element.remove_overlay(self)
message_element.action_buttons = None
history = window.convert_history_to_ollama(chat)[:list(chat.messages).index(message_element.message_id)]
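# Only the conversation up to (and excluding) this message is resent, so the model regenerates it from the same context it originally had.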
data = {
"model": window.model_manager.get_selected_model(),
"messages": history,
"options": {"temperature": window.ollama_instance.tweaks["temperature"], "seed": window.ollama_instance.tweaks["seed"]},
"keep_alive": f"{window.ollama_instance.tweaks['keep_alive']}m"
}
thread = threading.Thread(target=window.run_message, args=(data, message_element, chat))
thread.start()
else:
window.show_toast(_("Message cannot be regenerated while receiving a response"), window.main_overlay)
def edit_message(self):
logger.debug("Editing message")
self.get_parent().action_buttons.set_visible(False)
for child in self.get_parent().content_children:
self.get_parent().container.remove(child)
self.get_parent().content_children = []
self.get_parent().container.remove(self.get_parent().footer)
self.get_parent().footer = None
edit_text_b = edit_text_block(self.get_parent().text)
self.get_parent().container.append(edit_text_b)
window.set_focus(edit_text_b)
class message(Gtk.Overlay):
__gtype_name__ = 'AlpacaMessage'
def __init__(self, message_id:str, model:str=None):
self.message_id = message_id
self.bot = model is not None
self.dt = None
self.model = model
self.action_buttons = None
self.content_children = [] #These are the code blocks, text blocks and tables
self.footer = None
self.image_c = None
self.attachment_c = None
self.spinner = None
self.text = None
self.container = Gtk.Box(
orientation=1,
halign='fill',
css_classes=["response_message"] if self.bot else ["card", "user_message"],
spacing=5,
width_request=-1 if self.bot else 375
)
super().__init__(
css_classes=["message"],
name=message_id,
halign=0 if self.bot else 2
)
self.set_child(self.container)
def add_attachments(self, attachments:dict):
self.attachment_c = attachment_container()
self.container.append(self.attachment_c)
for file_path, file_type in attachments.items():
file = attachment(os.path.basename(file_path), file_path, file_type)
self.attachment_c.add_file(file)
def add_images(self, images:list):
self.image_c = image_container()
self.container.append(self.image_c)
for image_path in images:
image_element = image(image_path)
self.image_c.add_image(image_element)
def add_footer(self, dt:datetime.datetime):
self.dt = dt
self.footer = footer(self.dt, self.model)
self.container.append(self.footer)
def add_action_buttons(self):
if not self.action_buttons:
self.action_buttons = action_buttons(self.bot)
self.add_overlay(self.action_buttons)
if not self.text:
self.action_buttons.set_visible(False)
def update_message(self, data:dict):
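# Called once per streamed response chunk; a chunk carrying data['done'] finalizes the message and persists the chat.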
chat = self.get_parent().get_parent().get_parent().get_parent()
if chat.busy:
vadjustment = chat.get_vadjustment()
if self.spinner:
self.container.remove(self.spinner)
self.spinner = None
self.content_children[-1].set_visible(True)
GLib.idle_add(vadjustment.set_value, vadjustment.get_upper())
elif vadjustment.get_value() + 50 >= vadjustment.get_upper() - vadjustment.get_page_size():
GLib.idle_add(vadjustment.set_value, vadjustment.get_upper() - vadjustment.get_page_size())
GLib.idle_add(self.content_children[-1].insert_at_end, data['message']['content'], False)
if 'done' in data and data['done']:
window.chat_list_box.get_tab_by_name(chat.get_name()).spinner.set_visible(False)
if window.chat_list_box.get_current_chat().get_name() != chat.get_name():
window.chat_list_box.get_tab_by_name(chat.get_name()).indicator.set_visible(True)
if chat.welcome_screen:
chat.container.remove(chat.welcome_screen)
chat.welcome_screen = None
chat.stop_message()
self.text = self.content_children[-1].get_label()
GLib.idle_add(self.set_text, self.content_children[-1].get_label())
self.dt = datetime.datetime.now()
GLib.idle_add(self.add_footer, self.dt)
window.show_notification(chat.get_name(), self.text[:200] + (self.text[200:] and '...'), Gio.ThemedIcon.new("chat-message-new-symbolic"))
GLib.idle_add(window.save_history, chat)
else:
if self.spinner:
GLib.idle_add(self.container.remove, self.spinner)
self.spinner = None
chat_tab = window.chat_list_box.get_tab_by_name(chat.get_name())
if chat_tab.spinner:
GLib.idle_add(chat_tab.spinner.set_visible, False)
sys.exit()
def set_text(self, text:str=None):
self.text = text
for child in self.content_children:
self.container.remove(child)
self.content_children = []
if text:
self.content_children = []
code_block_pattern = re.compile(r'```(\w*)\n(.*?)\n\s*```', re.DOTALL)
no_language_code_block_pattern = re.compile(r'`(\w*)\n(.*?)\n\s*`', re.DOTALL)
table_pattern = re.compile(r'((\r?\n){2}|^)([^\r\n]*\|[^\r\n]*(\r?\n)?)+(?=(\r?\n){2}|$)', re.MULTILINE)
markup_pattern = re.compile(r'<(b|u|tt|span.*)>(.*?)<\/(b|u|tt|span)>') #heh butt span, I'm so funny
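# The raw text is scanned in successive passes (fenced code blocks, pipe tables, remaining text); every match becomes an entry in 'parts' and is rendered below as a code_block, TableWidget or text_block child.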
parts = []
pos = 0
# Code blocks
for match in code_block_pattern.finditer(self.text[pos:]):
start, end = match.span()
if pos < start:
normal_text = self.text[pos:start]
parts.append({"type": "normal", "text": normal_text.strip()})
language = match.group(1)
code_text = match.group(2)
parts.append({"type": "code", "text": code_text, "language": 'python3' if language == 'python' else language})
pos = end
for match in no_language_code_block_pattern.finditer(self.text[pos:]):
start, end = match.span()
if pos < start:
normal_text = self.text[pos:start]
parts.append({"type": "normal", "text": normal_text.strip()})
language = match.group(1)
code_text = match.group(2)
parts.append({"type": "code", "text": code_text, "language": None})
pos = end
# Tables
for match in table_pattern.finditer(self.text[pos:]):
start, end = match.span()
if pos < start:
normal_text = self.text[pos:start]
parts.append({"type": "normal", "text": normal_text.strip()})
table_text = match.group(0)
parts.append({"type": "table", "text": table_text})
pos = end
# Text blocks
if pos < len(self.text):
normal_text = self.text[pos:]
if normal_text.strip():
parts.append({"type": "normal", "text": normal_text.strip()})
for part in parts:
if part['type'] == 'normal':
text_b = text_block(self.bot)
part['text'] = part['text'].replace("\n* ", "\n")
part['text'] = re.sub(r'`([^`\n]*?)`', r'<tt>\1</tt>', part['text'])
part['text'] = re.sub(r'\*\*(.*?)\*\*', r'<b>\1</b>', part['text'], flags=re.MULTILINE)
part['text'] = re.sub(r'^#\s+(.*)', r'<span size="x-large">\1</span>', part['text'], flags=re.MULTILINE)
part['text'] = re.sub(r'^##\s+(.*)', r'<span size="large">\1</span>', part['text'], flags=re.MULTILINE)
part['text'] = re.sub(r'_(\((.*?)\)|\d+)', r'<sub>\2\1</sub>', part['text'], flags=re.MULTILINE)
part['text'] = re.sub(r'\^(\((.*?)\)|\d+)', r'<sup>\2\1</sup>', part['text'], flags=re.MULTILINE)
pos = 0
for match in markup_pattern.finditer(part['text']):
start, end = match.span()
if pos < start:
text_b.insert_at_end(part['text'][pos:start], False)
text_b.insert_at_end(match.group(0), True)
pos = end
if pos < len(part['text']):
text_b.insert_at_end(part['text'][pos:], False)
self.content_children.append(text_b)
self.container.append(text_b)
elif part['type'] == 'code':
code_b = code_block(part['text'], part['language'])
self.content_children.append(code_b)
self.container.append(code_b)
elif part['type'] == 'table':
table_w = TableWidget(part['text'])
self.content_children.append(table_w)
self.container.append(table_w)
self.add_action_buttons()
else:
text_b = text_block(self.bot)
text_b.set_visible(False)
self.content_children.append(text_b)
if self.spinner:
self.container.remove(self.spinner)
self.spinner = None
self.spinner = Gtk.Spinner(spinning=True, margin_top=10, margin_bottom=10, hexpand=True)
self.container.append(self.spinner)
self.container.append(text_b)
self.container.queue_draw()

View File

@ -1,673 +0,0 @@
#model_widget.py
"""
Handles the model widget (testing)
"""
import gi
gi.require_version('Gtk', '4.0')
gi.require_version('GtkSource', '5')
from gi.repository import Gtk, GObject, Gio, Adw, GtkSource, GLib, Gdk
import logging, os, datetime, re, shutil, threading, json, sys, glob
from ..internal import config_dir, data_dir, cache_dir, source_dir
from .. import available_models_descriptions
from . import dialog_widget
logger = logging.getLogger(__name__)
window = None
class model_selector_popup(Gtk.Popover):
__gtype_name__ = 'AlpacaModelSelectorPopup'
def __init__(self):
manage_models_button = Gtk.Button(
tooltip_text=_('Manage Models'),
child=Gtk.Label(label=_('Manage Models'), halign=1),
hexpand=True,
css_classes=['manage_models_button', 'flat']
)
manage_models_button.set_action_name("app.manage_models")
manage_models_button.connect("clicked", lambda *_: self.hide())
self.model_list_box = Gtk.ListBox(
css_classes=['navigation-sidebar', 'model_list_box'],
height_request=0
)
container = Gtk.Box(
orientation=1,
spacing=5
)
container.append(self.model_list_box)
container.append(Gtk.Separator())
container.append(manage_models_button)
scroller = Gtk.ScrolledWindow(
max_content_height=300,
propagate_natural_width=True,
propagate_natural_height=True,
child=container
)
super().__init__(
css_classes=['model_popover'],
has_arrow=False,
child=scroller
)
class model_selector_row(Gtk.ListBoxRow):
__gtype_name__ = 'AlpacaModelSelectorRow'
def __init__(self, model_name:str, data:dict):
super().__init__(
child = Gtk.Label(
label=window.convert_model_name(model_name, 0),
halign=1,
hexpand=True
),
halign=0,
hexpand=True,
name=model_name,
tooltip_text=window.convert_model_name(model_name, 0)
)
self.data = data
self.image_recognition = 'projector_info' in self.data
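# A 'projector_info' entry in the api/show response means the model ships a multimodal projector; Alpaca treats that as image-recognition support.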
class model_selector_button(Gtk.MenuButton):
__gtype_name__ = 'AlpacaModelSelectorButton'
def __init__(self):
self.popover = model_selector_popup()
self.popover.model_list_box.connect('selected-rows-changed', self.model_changed)
self.popover.model_list_box.connect('row-activated', lambda *_: self.get_popover().hide())
container = Gtk.Box(
orientation=0,
spacing=5
)
self.label = Gtk.Label()
container.append(self.label)
container.append(Gtk.Image.new_from_icon_name("down-symbolic"))
super().__init__(
child=container,
popover=self.popover,
halign=3
)
def change_model(self, model_name:str):
for model_row in list(self.get_popover().model_list_box):
if model_name == model_row.get_name():
self.get_popover().model_list_box.select_row(model_row)
break
def model_changed(self, listbox:Gtk.ListBox):
row = listbox.get_selected_row()
if row:
model_name = row.get_name()
self.label.set_label(window.convert_model_name(model_name, 0))
self.set_tooltip_text(window.convert_model_name(model_name, 0))
elif len(list(listbox)) == 0:
window.title_stack.set_visible_child_name('no_models')
window.model_manager.verify_if_image_can_be_used()
def add_model(self, model_name:str):
data = None
response = window.ollama_instance.request("POST", "api/show", json.dumps({"name": model_name}))
if response.status_code != 200:
logger.error(f"Status code was {response.status_code}")
return
try:
data = json.loads(response.text)
except Exception as e:
logger.error(f"Error fetching 'api - show' info: {str(e)}")
model_row = model_selector_row(model_name, data)
GLib.idle_add(self.get_popover().model_list_box.append, model_row)
GLib.idle_add(self.change_model, model_name)
GLib.idle_add(window.title_stack.set_visible_child_name, 'model_selector')
def remove_model(self, model_name:str):
self.get_popover().model_list_box.remove(next((model for model in list(self.get_popover().model_list_box) if model.get_name() == model_name), None))
self.model_changed(self.get_popover().model_list_box)
window.title_stack.set_visible_child_name('model_selector' if len(window.model_manager.get_model_list()) > 0 else 'no_models')
def clear_list(self):
self.get_popover().model_list_box.remove_all()
class pulling_model(Gtk.ListBoxRow):
__gtype_name__ = 'AlpacaPullingModel'
def __init__(self, model_name:str):
model_label = Gtk.Label(
css_classes=["heading"],
label=model_name.split(":")[0].replace("-", " ").title(),
hexpand=True,
halign=1
)
tag_label = Gtk.Label(
css_classes=["subtitle"],
label=model_name.split(":")[1]
)
self.prc_label = Gtk.Label(
css_classes=["subtitle", "numeric"],
label='50%',
hexpand=True,
halign=2
)
subtitle_box = Gtk.Box(
hexpand=True,
spacing=5,
orientation=0
)
subtitle_box.append(tag_label)
subtitle_box.append(self.prc_label)
self.progress_bar = Gtk.ProgressBar(
valign=2,
show_text=False,
css_classes=["horizontal"],
fraction=.5
)
description_box = Gtk.Box(
hexpand=True,
vexpand=True,
spacing=5,
orientation=1
)
description_box.append(model_label)
description_box.append(subtitle_box)
description_box.append(self.progress_bar)
stop_button = Gtk.Button(
icon_name = "media-playback-stop-symbolic",
vexpand = False,
valign = 3,
css_classes = ["error", "circular"],
tooltip_text = _("Stop Pulling '{}'").format(window.convert_model_name(model_name, 0))
)
stop_button.connect('clicked', lambda *i: dialog_widget.simple(
_('Stop Download?'),
_("Are you sure you want to stop pulling '{}'?").format(window.convert_model_name(self.get_name(), 0)),
self.stop,
_('Stop'),
'destructive'
))
container_box = Gtk.Box(
hexpand=True,
vexpand=True,
spacing=10,
orientation=0,
margin_top=10,
margin_bottom=10,
margin_start=10,
margin_end=10
)
container_box.append(description_box)
container_box.append(stop_button)
super().__init__(
child=container_box,
name=model_name
)
self.error = None
self.digests = []
def stop(self):
if len(list(self.get_parent())) == 1:
self.get_parent().set_visible(False)
self.get_parent().remove(self)
def update(self, data):
if 'digest' in data and data['digest'] not in self.digests:
self.digests.append(data['digest'].replace(':', '-'))
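# Digests are remembered so that, if the pull is cancelled, the partially downloaded blobs can be cleaned up below.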
if not self.get_parent():
logger.info("Pulling of '{}' was canceled".format(self.get_name()))
directory = os.path.join(data_dir, '.ollama', 'models', 'blobs')
for digest in self.digests:
files_to_delete = glob.glob(os.path.join(directory, digest + '*'))
for file in files_to_delete:
logger.info("Deleting '{}'".format(file))
try:
os.remove(file)
except Exception as e:
logger.error(f"Can't delete file {file}: {e}")
sys.exit()
if 'error' in data:
self.error = data['error']
if 'total' in data and 'completed' in data:
fraction = round(data['completed'] / data['total'], 4)
GLib.idle_add(self.prc_label.set_label, f"{fraction:05.2%}")
GLib.idle_add(self.progress_bar.set_fraction, fraction)
else:
GLib.idle_add(self.prc_label.set_label, data['status'])
GLib.idle_add(self.progress_bar.pulse)
class pulling_model_list(Gtk.ListBox):
__gtype_name__ = 'AlpacaPullingModelList'
def __init__(self):
super().__init__(
selection_mode=0,
css_classes=["boxed-list"],
visible=False
)
class information_bow(Gtk.Box):
__gtype_name__ = 'AlpacaModelInformationBow'
def __init__(self, title:str, subtitle:str):
self.title = title
self.subtitle = subtitle
title_label = Gtk.Label(
label=self.title,
css_classes=['subtitle', 'caption', 'dim-label'],
hexpand=True,
margin_top=10,
margin_start=0,
margin_end=0
)
subtitle_label = Gtk.Label(
label=self.subtitle if self.subtitle else '(none)',
css_classes=['heading'],
hexpand=True,
margin_bottom=10,
margin_start=0,
margin_end=0
)
super().__init__(
spacing=5,
orientation=1,
css_classes=['card']
)
self.append(title_label)
self.append(subtitle_label)
class local_model(Gtk.ListBoxRow):
__gtype_name__ = 'AlpacaLocalModel'
def __init__(self, model_name:str):
model_title = window.convert_model_name(model_name, 0)
model_label = Gtk.Label(
css_classes=["heading"],
label=model_title.split(" (")[0],
hexpand=True,
halign=1
)
tag_label = Gtk.Label(
css_classes=["subtitle"],
label=model_title.split(" (")[1][:-1],
hexpand=True,
halign=1
)
description_box = Gtk.Box(
hexpand=True,
vexpand=True,
spacing=5,
orientation=1
)
description_box.append(model_label)
description_box.append(tag_label)
info_button = Gtk.Button(
icon_name = "info-outline-symbolic",
vexpand = False,
valign = 3,
css_classes = ["circular"],
tooltip_text = _("Details")
)
info_button.connect('clicked', self.show_information)
delete_button = Gtk.Button(
icon_name = "user-trash-symbolic",
vexpand = False,
valign = 3,
css_classes = ["error", "circular"],
tooltip_text = _("Remove '{}'").format(window.convert_model_name(model_name, 0))
)
delete_button.connect('clicked', lambda *i: dialog_widget.simple(
_('Delete Model?'),
_("Are you sure you want to delete '{}'?").format(model_title),
lambda model_name=model_name: window.model_manager.remove_local_model(model_name),
_('Delete'),
'destructive'
))
container_box = Gtk.Box(
hexpand=True,
vexpand=True,
spacing=10,
orientation=0,
margin_top=10,
margin_bottom=10,
margin_start=10,
margin_end=10
)
container_box.append(description_box)
container_box.append(info_button)
container_box.append(delete_button)
super().__init__(
child=container_box,
name=model_name
)
def show_information(self, button):
model = next((element for element in list(window.model_manager.model_selector.get_popover().model_list_box) if element.get_name() == self.get_name()), None)
model_name = model.get_child().get_label()
window.model_detail_page.set_title(' ('.join(model_name.split(' (')[:-1]))
window.model_detail_page.set_description(' ('.join(model_name.split(' (')[-1:])[:-1])
window.model_detail_create_button.set_name(model_name)
window.model_detail_create_button.set_tooltip_text(_("Create Model Based on '{}'").format(model_name))
details_flow_box = Gtk.FlowBox(
valign=1,
hexpand=True,
vexpand=False,
selection_mode=0,
max_children_per_line=2,
min_children_per_line=1,
margin_top=12,
margin_bottom=12,
margin_start=12,
margin_end=12
)
translation_strings={
'modified_at': _('Modified At'),
'parent_model': _('Parent Model'),
'format': _('Format'),
'family': _('Family'),
'parameter_size': _('Parameter Size'),
'quantization_level': _('Quantization Level')
}
if 'modified_at' in model.data and model.data['modified_at']:
details_flow_box.append(information_bow(
title=translation_strings['modified_at'],
subtitle=datetime.datetime.strptime(':'.join(model.data['modified_at'].split(':')[:2]), '%Y-%m-%dT%H:%M').strftime('%Y-%m-%d %H:%M')
))
for name, value in model.data['details'].items():
if isinstance(value, str):
details_flow_box.append(information_bow(
title=translation_strings[name] if name in translation_strings else name.replace('_', ' ').title(),
subtitle=value
))
window.model_detail_page.set_child(details_flow_box)
window.navigation_view_manage_models.push_by_tag('model_information')
class local_model_list(Gtk.ListBox):
__gtype_name__ = 'AlpacaLocalModelList'
def __init__(self):
super().__init__(
selection_mode=0,
css_classes=["boxed-list"],
visible=False
)
def add_model(self, model_name:str):
model = local_model(model_name)
GLib.idle_add(self.append, model)
if not self.get_visible():
self.set_visible(True)
def remove_model(self, model_name:str):
self.remove(next((model for model in list(self) if model.get_name() == model_name), None))
class available_model(Gtk.ListBoxRow):
__gtype_name__ = 'AlpacaAvailableModel'
def __init__(self, model_name:str, model_author:str, model_description:str, image_recognition:bool):
self.model_description = model_description
self.model_title = model_name.replace("-", " ").title()
self.model_author = model_author
self.image_recognition = image_recognition
model_label = Gtk.Label(
css_classes=["heading"],
label="<b>{}</b> <small>by {}</small>".format(self.model_title, self.model_author),
hexpand=True,
halign=1,
use_markup=True,
wrap=True,
wrap_mode=0
)
description_label = Gtk.Label(
css_classes=["subtitle"],
label=self.model_description,
hexpand=True,
halign=1,
wrap=True,
wrap_mode=0,
)
image_recognition_indicator = Gtk.Button(
css_classes=["success", "pill", "image_recognition_indicator"],
child=Gtk.Label(
label=_("Image Recognition"),
css_classes=["subtitle"]
),
halign=1
)
description_box = Gtk.Box(
hexpand=True,
vexpand=True,
spacing=5,
orientation=1
)
description_box.append(model_label)
description_box.append(description_label)
if self.image_recognition: description_box.append(image_recognition_indicator)
container_box = Gtk.Box(
hexpand=True,
vexpand=True,
spacing=10,
orientation=0,
margin_top=10,
margin_bottom=10,
margin_start=10,
margin_end=10
)
next_icon = Gtk.Image.new_from_icon_name("go-next")
next_icon.update_property([4], [_("Enter download menu for {}").format(self.model_title)])
container_box.append(description_box)
container_box.append(next_icon)
super().__init__(
child=container_box,
name=model_name
)
gesture_click = Gtk.GestureClick.new()
gesture_click.connect("pressed", lambda *_: self.show_pull_menu())
event_controller_key = Gtk.EventControllerKey.new()
event_controller_key.connect("key-pressed", lambda controller, key, *_: self.show_pull_menu() if key in (Gdk.KEY_space, Gdk.KEY_Return) else None)
self.add_controller(gesture_click)
self.add_controller(event_controller_key)
def confirm_pull_model(self, model_name):
threading.Thread(target=window.model_manager.pull_model, args=(model_name,)).start()
window.navigation_view_manage_models.pop()
def show_pull_menu(self):
with open(os.path.join(source_dir, 'available_models.json'), 'r', encoding="utf-8") as f:
data = json.load(f)
window.navigation_view_manage_models.push_by_tag('model_tags_page')
window.navigation_view_manage_models.find_page('model_tags_page').set_title(self.get_name().replace("-", " ").title())
window.model_link_button.set_name(data[self.get_name()]['url'])
window.model_link_button.set_tooltip_text(data[self.get_name()]['url'])
window.model_tag_list_box.remove_all()
tags = data[self.get_name()]['tags']
for tag_data in tags:
if f"{self.get_name()}:{tag_data[0]}" not in window.model_manager.get_model_list():
tag_row = Adw.ActionRow(
title = tag_data[0],
subtitle = tag_data[1],
name = f"{self.get_name()}:{tag_data[0]}"
)
download_icon = Gtk.Image.new_from_icon_name("folder-download-symbolic")
tag_row.add_suffix(download_icon)
download_icon.update_property([4], [_("Download {}:{}").format(self.get_name(), tag_data[0])])
gesture_click = Gtk.GestureClick.new()
gesture_click.connect("pressed", lambda *_, name=f"{self.get_name()}:{tag_data[0]}" : self.confirm_pull_model(name))
event_controller_key = Gtk.EventControllerKey.new()
event_controller_key.connect("key-pressed", lambda controller, key, *_, name=f"{self.get_name()}:{tag_data[0]}" : self.confirm_pull_model(name) if key in (Gdk.KEY_space, Gdk.KEY_Return) else None)
tag_row.add_controller(gesture_click)
tag_row.add_controller(event_controller_key)
window.model_tag_list_box.append(tag_row)
class available_model_list(Gtk.ListBox):
__gtype_name__ = 'AlpacaAvailableModelList'
def __init__(self):
super().__init__(
selection_mode=0,
css_classes=["boxed-list"],
visible=False
)
def add_model(self, model_name:str, model_author:str, model_description:str, image_recognition:bool):
model = available_model(model_name, model_author, model_description, image_recognition)
self.append(model)
if not self.get_visible():
self.set_visible(True)
class model_manager_container(Gtk.Box):
__gtype_name__ = 'AlpacaModelManagerContainer'
def __init__(self):
super().__init__(
margin_top=12,
margin_bottom=12,
margin_start=12,
margin_end=12,
spacing=12,
orientation=1
)
self.pulling_list = pulling_model_list()
self.append(self.pulling_list)
self.local_list = local_model_list()
self.append(self.local_list)
self.available_list = available_model_list()
self.append(self.available_list)
self.model_selector = model_selector_button()
window.title_stack.add_named(self.model_selector, 'model_selector')
def add_local_model(self, model_name:str):
self.local_list.add_model(model_name)
if not self.local_list.get_visible():
self.local_list.set_visible(True)
self.model_selector.add_model(model_name)
def remove_local_model(self, model_name:str):
logger.debug("Deleting model")
response = window.ollama_instance.request("DELETE", "api/delete", json.dumps({"name": model_name}))
if response.status_code == 200:
self.local_list.remove_model(model_name)
self.model_selector.remove_model(model_name)
if len(self.get_model_list()) == 0:
self.local_list.set_visible(False)
window.chat_list_box.update_welcome_screens(False)
window.show_toast(_("Model deleted successfully"), window.manage_models_overlay)
else:
window.manage_models_dialog.close()
window.connection_error()
def get_selected_model(self) -> str:
row = self.model_selector.get_popover().model_list_box.get_selected_row()
if row:
return row.get_name()
def get_model_list(self) -> list:
return [model.get_name() for model in list(self.model_selector.get_popover().model_list_box)]
#Should only be called when the app starts
def update_local_list(self):
try:
response = window.ollama_instance.request("GET", "api/tags")
if response.status_code == 200:
self.model_selector.popover.model_list_box.remove_all()
self.local_list.remove_all()
data = json.loads(response.text)
if len(data['models']) == 0:
self.local_list.set_visible(False)
else:
self.local_list.set_visible(True)
for model in data['models']:
threading.Thread(target=self.add_local_model, args=(model['name'], )).start()
else:
window.connection_error()
except Exception as e:
logger.error(e)
window.connection_error()
window.title_stack.set_visible_child_name('model_selector' if len(window.model_manager.get_model_list()) > 0 else 'no_models')
#window.title_stack.set_visible_child_name('model_selector')
window.chat_list_box.update_welcome_screens(len(self.get_model_list()) > 0)
#Should only be called when the app starts
def update_available_list(self):
with open(os.path.join(source_dir, 'available_models.json'), 'r', encoding="utf-8") as f:
for name, model_info in json.load(f).items():
self.available_list.add_model(name, model_info['author'], available_models_descriptions.descriptions[name], model_info['image'])
def change_model(self, model_name:str):
self.model_selector.change_model(model_name)
def verify_if_image_can_be_used(self):
logger.debug("Verifying if image can be used")
selected = self.model_selector.get_popover().model_list_box.get_selected_row()
if selected and selected.image_recognition:
for name, content in window.attachments.items():
if content['type'] == 'image':
content['button'].set_css_classes(["flat"])
return True
elif selected:
for name, content in window.attachments.items():
if content['type'] == 'image':
content['button'].set_css_classes(["flat", "error"])
def pull_model(self, model_name:str, modelfile:str=None):
if ':' not in model_name:
model_name += ':latest'
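# Bare names are normalized ('llama3' -> 'llama3:latest') so the duplicate checks below compare like with like.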
if model_name not in [model.get_name() for model in list(self.pulling_list)] and model_name not in [model.get_name() for model in list(self.local_list)]:
logger.info("Pulling model: {}".format(model_name))
model = pulling_model(model_name)
self.pulling_list.append(model)
if not self.pulling_list.get_visible():
GLib.idle_add(self.pulling_list.set_visible, True)
if modelfile:
response = window.ollama_instance.request("POST", "api/create", json.dumps({"name": model_name, "modelfile": modelfile}), lambda data: model.update(data))
else:
response = window.ollama_instance.request("POST", "api/pull", json.dumps({"name": model_name}), lambda data: model.update(data))
if response.status_code == 200 and not model.error:
GLib.idle_add(window.show_notification, _("Task Complete"), _("Model '{}' pulled successfully.").format(model_name), Gio.ThemedIcon.new("emblem-ok-symbolic"))
GLib.idle_add(window.show_toast, _("Model '{}' pulled successfully.").format(model_name), window.manage_models_overlay)
self.add_local_model(model_name)
elif response.status_code == 200:
GLib.idle_add(window.show_notification, _("Pull Model Error"), _("Failed to pull model '{}': {}").format(model_name, model.error), Gio.ThemedIcon.new("dialog-error-symbolic"))
GLib.idle_add(window.show_toast, _("Error pulling '{}': {}").format(model_name, model.error), window.manage_models_overlay)
else:
GLib.idle_add(window.show_notification, _("Pull Model Error"), _("Failed to pull model '{}' due to network error.").format(model_name), Gio.ThemedIcon.new("dialog-error-symbolic"))
GLib.idle_add(window.show_toast, _("Error pulling '{}'").format(model_name), window.manage_models_overlay)
GLib.idle_add(window.manage_models_dialog.close)
GLib.idle_add(window.connection_error)
self.pulling_list.remove(model)
GLib.idle_add(window.chat_list_box.update_welcome_screens, len(self.get_model_list()) > 0)
if len(list(self.pulling_list)) == 0:
GLib.idle_add(self.pulling_list.set_visible, False)

View File

@ -1,132 +0,0 @@
#table_widget.py
"""
Handles the table widget shown in chat responses
"""
import gi
gi.require_version('Gtk', '4.0')
from gi.repository import Gtk, GObject, Gio
import re
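# Illustrative input this parser expects (a standard pipe-delimited Markdown table):
# | Name  | Size |
# | :---- | ---: |
# | a.txt | 1 KB |
# The first line supplies the headers, the second the column alignments,
# and every following line becomes a Row stored in a Gio.ListStore.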
class MarkdownTable:
def __init__(self):
self.headers = []
self.rows = Gio.ListStore()
self.alignments = []
def __repr__(self):
table_repr = 'Headers: {}\n'.format(self.headers)
table_repr += 'Alignments: {}\n'.format(self.alignments)
table_repr += 'Rows:\n'
for row in self.rows:
table_repr += ' | '.join(row) + '\n'
return table_repr
class Row(GObject.GObject):
def __init__(self, _values):
super().__init__()
self.values = _values
def get_column_value(self, index):
return self.values[index]
class TableWidget(Gtk.Frame):
__gtype_name__ = 'TableWidget'
def __init__(self, markdown):
super().__init__()
self.set_margin_start(5)
self.set_margin_end(5)
self.table = MarkdownTable()
self.set_halign(Gtk.Align.START)
self.table_widget = Gtk.ColumnView(
show_column_separators=True,
show_row_separators=True,
reorderable=False,
)
scrolled_window = Gtk.ScrolledWindow(
vscrollbar_policy=Gtk.PolicyType.NEVER,
propagate_natural_width=True
)
self.set_child(scrolled_window)
try:
self.parse_markdown_table(markdown)
self.make_table()
scrolled_window.set_child(self.table_widget)
except Exception:
label = Gtk.Label(
label=markdown.lstrip('\n').rstrip('\n'),
selectable=True,
margin_top=6,
margin_bottom=6,
margin_start=6,
margin_end=6
)
scrolled_window.set_child(label)
def parse_markdown_table(self, markdown_text):
# Define regex patterns for matching the table components
header_pattern = r'^\|(.+?)\|$'
separator_pattern = r'^\|(\s*[:-]+:?\s*\|)+$'
row_pattern = r'^\|(.+?)\|$'
# Split the text into lines
lines = markdown_text.strip().split('\n')
# Extract headers
header_match = re.match(header_pattern, lines[0], re.MULTILINE)
if header_match:
headers = [header.strip() for header in header_match.group(1).replace("*", "").split('|') if header.strip()]
self.table.headers = headers
# Extract alignments
separator_match = re.match(separator_pattern, lines[1], re.MULTILINE)
if separator_match:
alignments = []
separator_columns = lines[1].replace(" ", "").split('|')[1:-1]
for sep in separator_columns:
if ':' in sep:
if sep.startswith('-') and sep.endswith(':'):
alignments.append(1)
elif sep.startswith(':') and sep.endswith('-'):
alignments.append(0)
else:
alignments.append(0.5)
else:
alignments.append(0) # Default alignment is start
self.table.alignments = alignments
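# These alignment values double as the Gtk.Label xalign fractions used in make_table: 0 = left, 0.5 = center, 1 = right.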
# Extract rows
for line in lines[2:]:
row_match = re.match(row_pattern, line, re.MULTILINE)
if row_match:
rows = line.split('|')[1:-1]
row = Row(rows)
self.table.rows.append(row)
def make_table(self):
def _on_factory_setup(_factory, list_item, align):
label = Gtk.Label(xalign=align, ellipsize=3, selectable=True)
list_item.set_child(label)
def _on_factory_bind(_factory, list_item, index):
label_widget = list_item.get_child()
row = list_item.get_item()
label_widget.set_label(row.get_column_value(index))
for index, column_name in enumerate(self.table.headers):
column = Gtk.ColumnViewColumn(title=column_name, expand=True)
factory = Gtk.SignalListItemFactory()
factory.connect("setup", _on_factory_setup, self.table.alignments[index])
factory.connect("bind", _on_factory_bind, index)
column.set_factory(factory)
self.table_widget.append_column(column)
selection = Gtk.NoSelection.new(model=self.table.rows)
self.table_widget.set_model(model=selection)

View File

@ -1,91 +0,0 @@
#terminal_widget.py
"""
Handles the terminal widget
"""
import gi
gi.require_version('Gtk', '4.0')
gi.require_version('Vte', '3.91')
from gi.repository import Gtk, Vte, GLib, Pango, Gdk
import logging, os, shutil, subprocess, re
from ..internal import data_dir
logger = logging.getLogger(__name__)
window = None
class terminal(Vte.Terminal):
__gtype_name__ = 'AlpacaTerminal'
def __init__(self, script:list):
super().__init__(css_classes=["terminal"])
self.set_font(Pango.FontDescription.from_string("Monospace 12"))
self.set_clear_background(False)
pty = Vte.Pty.new_sync(Vte.PtyFlags.DEFAULT, None)
self.set_pty(pty)
pty.spawn_async(
GLib.get_current_dir(),
script,
[],
GLib.SpawnFlags.DEFAULT,
None,
None,
-1,
None,
None
)
key_controller = Gtk.EventControllerKey()
key_controller.connect("key-pressed", self.on_key_press)
self.add_controller(key_controller)
def on_key_press(self, controller, keyval, keycode, state):
ctrl = state & Gdk.ModifierType.CONTROL_MASK
shift = state & Gdk.ModifierType.SHIFT_MASK
if ctrl and keyval == Gdk.KEY_c:
self.copy_clipboard()
return True
return False
def show_terminal(script):
window.terminal_scroller.set_child(terminal(script))
window.terminal_dialog.present(window)
def run_terminal(script:str, language_name:str):
logger.info('Running: \n{}'.format(language_name))
if language_name == 'python3':
if not os.path.isdir(os.path.join(data_dir, 'pyenv')):
os.mkdir(os.path.join(data_dir, 'pyenv'))
with open(os.path.join(data_dir, 'pyenv', 'main.py'), 'w') as f:
f.write(script)
script = [
'echo "🐍 {}\n"'.format(_('Setting up Python environment...')),
'python3 -m venv "{}"'.format(os.path.join(data_dir, 'pyenv')),
'{} {}'.format(os.path.join(data_dir, 'pyenv', 'bin', 'python3').replace(' ', '\\ '), os.path.join(data_dir, 'pyenv', 'main.py').replace(' ', '\\ '))
]
if os.path.isfile(os.path.join(data_dir, 'pyenv', 'requirements.txt')):
script.insert(1, '{} install -r {} | grep -v "already satisfied"; clear'.format(os.path.join(data_dir, 'pyenv', 'bin', 'pip3'), os.path.join(data_dir, 'pyenv', 'requirements.txt')))
else:
with open(os.path.join(data_dir, 'pyenv', 'requirements.txt'), 'w') as f:
f.write('')
script = ';\n'.join(script)
script += '; echo "\n🦙 {}"'.format(_('Script exited'))
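# Roughly, the joined script ends up as (paths abbreviated; the pip step is only present once requirements.txt exists):
# echo "🐍 Setting up Python environment...";
# .../pyenv/bin/pip3 install -r .../pyenv/requirements.txt | grep -v "already satisfied"; clear;
# python3 -m venv ".../pyenv";
# .../pyenv/bin/python3 .../pyenv/main.py; echo "🦙 Script exited"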
if language_name == 'bash':
script = re.sub(r'(?m)^\s*sudo', 'pkexec', script)
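# sudo is rewritten to pkexec, presumably so privilege escalation goes through a graphical polkit prompt instead of sudo's terminal password prompt.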
if shutil.which('flatpak-spawn') and language_name == 'bash':
sandbox = True
try:
process = subprocess.run(['flatpak-spawn', '--host', 'bash', '-c', 'echo "test"'], check=True)
sandbox = False
except Exception as e:
pass
if sandbox:
script = 'echo "🦙 {}\n";'.format(_('The script is contained inside Flatpak')) + script
show_terminal(['bash', '-c', script])
else:
show_terminal(['flatpak-spawn', '--host', 'bash', '-c', script])
else:
show_terminal(['bash', '-c', script])

View File

@ -1,83 +0,0 @@
#generic_actions.py
"""
Working on organizing the code
"""
import os, requests
from youtube_transcript_api import YouTubeTranscriptApi
from html2text import html2text
from .internal import cache_dir
window = None
def connect_remote(remote_url:str, bearer_token:str):
window.ollama_instance.remote_url=remote_url
window.ollama_instance.bearer_token=bearer_token
window.ollama_instance.remote = True
window.ollama_instance.stop()
window.model_manager.update_local_list()
window.save_server_config()
def attach_youtube(video_title:str, video_author:str, watch_url:str, video_url:str, video_id:str, caption_name:str):
buffer = window.message_text_view.get_buffer()
text = buffer.get_text(buffer.get_start_iter(), buffer.get_end_iter(), False).replace(video_url, "")
buffer.delete(buffer.get_start_iter(), buffer.get_end_iter())
buffer.insert(buffer.get_start_iter(), text, len(text))
result_text = "{}\n{}\n{}\n\n".format(video_title, video_author, watch_url)
caption_name = caption_name.split(' (')[-1][:-1]
if caption_name.startswith('Translate:'):
available_captions = get_youtube_transcripts(video_id)
original_caption_name = available_captions[0].split(' (')[-1][:-1]
transcript = YouTubeTranscriptApi.list_transcripts(video_id).find_transcript([original_caption_name]).translate(caption_name.split(':')[-1]).fetch()
result_text += '(Auto translated from {})\n'.format(available_captions[0])
else:
transcript = YouTubeTranscriptApi.get_transcript(video_id, languages=[caption_name])
result_text += '\n'.join([t['text'] for t in transcript])
if not os.path.exists(os.path.join(cache_dir, 'tmp/youtube')):
os.makedirs(os.path.join(cache_dir, 'tmp/youtube'))
file_path = os.path.join(os.path.join(cache_dir, 'tmp/youtube'), '{} ({})'.format(video_title.replace('/', ' '), caption_name))
with open(file_path, 'w+', encoding="utf-8") as f:
f.write(result_text)
window.attach_file(file_path, 'youtube')
def get_youtube_transcripts(video_id:str):
return ['{} ({})'.format(t.language, t.language_code) for t in YouTubeTranscriptApi.list_transcripts(video_id)]
def attach_website(url:str):
response = requests.get(url)
if response.status_code == 200:
html = response.text
md = html2text(html)
buffer = window.message_text_view.get_buffer()
textview_text = buffer.get_text(buffer.get_start_iter(), buffer.get_end_iter(), False).replace(url, "")
buffer.delete(buffer.get_start_iter(), buffer.get_end_iter())
buffer.insert(buffer.get_start_iter(), textview_text, len(textview_text))
if not os.path.exists('/tmp/alpaca/websites/'):
os.makedirs('/tmp/alpaca/websites/')
md_name = window.generate_numbered_name('website.md', os.listdir('/tmp/alpaca/websites'))
file_path = os.path.join('/tmp/alpaca/websites/', md_name)
with open(file_path, 'w+', encoding="utf-8") as f:
f.write('{}\n\n{}'.format(url, md))
window.attach_file(file_path, 'website')
else:
window.show_toast(_("An error occurred while extracting text from the website"), window.main_overlay)
def attach_file(file):
file_types = {
"plain_text": ["txt", "md", "html", "css", "js", "py", "java", "json", "xml"],
"image": ["png", "jpeg", "jpg", "webp", "gif"],
"pdf": ["pdf"]
}
extension = file.get_path().split(".")[-1]
file_type = next((key for key, value in file_types.items() if extension in value), None)
if not file_type:
return
if file_type == 'image' and not window.model_manager.verify_if_image_can_be_used():
window.show_toast(_("Image recognition is only available on specific models"), window.main_overlay)
return
window.attach_file(file.get_path(), file_type)

View File

@ -1,2 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg xmlns="http://www.w3.org/2000/svg" height="16px" viewBox="0 0 16 16" width="16px"><g fill="#222222"><path d="m 5.976562 2 c 0.546876 0 1 0.453125 1 1 v 10 c 0 0.546875 -0.453124 1 -1 1 h -0.976562 c -1.652344 0 -3 -1.347656 -3 -3 v -6 c 0 -1.652344 1.347656 -3 3 -3 z m -5.976562 3 v 6 c 0 2.765625 2.234375 5 5 5 h 0.976562 c 1.660157 0 3 -1.339844 3 -3 v -10 c 0 -1.660156 -1.339843 -3 -3 -3 h -0.976562 c -2.765625 0 -5 2.234375 -5 5 z m 0 0"/><path d="m 1.488281 8.996094 h 1.511719 c 1.101562 0 1.988281 -0.886719 1.988281 -1.984375 v -0.515625 c 0 -0.273438 -0.222656 -0.5 -0.5 -0.5 c -0.273437 0 -0.5 0.226562 -0.5 0.5 v 0.515625 c 0 0.542969 -0.445312 0.984375 -0.988281 0.984375 h -1.511719 c -0.273437 0 -0.5 0.226562 -0.5 0.5 c 0 0.277344 0.226563 0.5 0.5 0.5 z m 0 0"/><path d="m 7.5 9.992188 h -1.511719 c -1.101562 0 -1.988281 0.886718 -1.988281 1.984374 v 0.515626 c 0 0.273437 0.222656 0.5 0.5 0.5 s 0.5 -0.226563 0.5 -0.5 v -0.515626 c 0 -0.539062 0.445312 -0.984374 0.988281 -0.984374 h 1.511719 c 0.277344 0 0.5 -0.226563 0.5 -0.5 c 0 -0.277344 -0.222656 -0.5 -0.5 -0.5 z m 0 0"/><path d="m 11.015625 14 h -1.035156 c -0.546875 0 -1 -0.453125 -1 -1 v -10 c 0 -0.546875 0.453125 -1 1 -1 h 1.035156 v -2 h -1.035156 c -1.664063 0 -3 1.339844 -3 3 v 10 c 0 1.660156 1.335937 3 3 3 h 1.035156 z m 0 0"/><path d="m 10 5 h 2.242188 l 2.148437 -2.6875 l -0.78125 -0.625 l -2 2.5 l 0.390625 -0.1875 h -2 z m 0 0"/><path d="m 10 11 h 2 l -0.390625 -0.1875 l 2 2.5 l 0.78125 -0.625 l -2.148437 -2.6875 h -2.242188 z m 0 0"/><path d="m 14.488281 1.976562 c -0.265625 0 -0.488281 -0.21875 -0.488281 -0.488281 c 0 -0.265625 0.222656 -0.488281 0.488281 -0.488281 c 0.269531 0 0.488281 0.222656 0.488281 0.488281 c 0 0.269531 -0.21875 0.488281 -0.488281 0.488281 z m 0 -1.976562 c -0.824219 0 -1.488281 0.664062 -1.488281 1.488281 s 0.664062 1.488281 1.488281 1.488281 s 1.488281 -0.664062 1.488281 -1.488281 s -0.664062 -1.488281 -1.488281 -1.488281 z m 0 0"/><path d="m 14.488281 13.976562 c -0.265625 0 -0.488281 -0.21875 -0.488281 -0.488281 c 0 -0.265625 0.222656 -0.488281 0.488281 -0.488281 c 0.269531 0 0.488281 0.222656 0.488281 0.488281 c 0 0.269531 -0.21875 0.488281 -0.488281 0.488281 z m 0 -1.976562 c -0.824219 0 -1.488281 0.664062 -1.488281 1.488281 s 0.664062 1.488281 1.488281 1.488281 s 1.488281 -0.664062 1.488281 -1.488281 s -0.664062 -1.488281 -1.488281 -1.488281 z m 0 0"/><path d="m 14.488281 7.976562 c -0.265625 0 -0.488281 -0.21875 -0.488281 -0.488281 c 0 -0.265625 0.222656 -0.488281 0.488281 -0.488281 c 0.269531 0 0.488281 0.222656 0.488281 0.488281 c 0 0.269531 -0.21875 0.488281 -0.488281 0.488281 z m 0 -1.976562 c -0.824219 0 -1.488281 0.664062 -1.488281 1.488281 s 0.664062 1.488281 1.488281 1.488281 s 1.488281 -0.664062 1.488281 -1.488281 s -0.664062 -1.488281 -1.488281 -1.488281 z m 0 0"/></g><path d="m 10 7.53125 h 4" fill="none" stroke="#222222"/><path d="m 4.5 4 h 3 c 0.277344 0 0.5 0.222656 0.5 0.5 s -0.222656 0.5 -0.5 0.5 h -3 c -0.277344 0 -0.5 -0.222656 -0.5 -0.5 s 0.222656 -0.5 0.5 -0.5 z m 0 0" fill="#222222"/></svg>


View File

@ -1,2 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg xmlns="http://www.w3.org/2000/svg" height="16px" viewBox="0 0 16 16" width="16px"><g fill="#222222"><path d="m 2.683594 9.777344 c -1.570313 -0.542969 -2.683594 -2.039063 -2.683594 -3.777344 c 0 -2.199219 1.800781 -4 4 -4 h 3 c 2.199219 0 4 1.800781 4 4 c 0 1.640625 -0.992188 3.070312 -2.421875 3.679688 l -0.785156 -1.839844 c 0.710937 -0.304688 1.207031 -1 1.207031 -1.839844 c 0 -1.125 -0.875 -2 -2 -2 h -3 c -1.125 0 -2 0.875 -2 2 c 0 0.890625 0.558594 1.621094 1.339844 1.890625 z m 0 0"/><path d="m 8 14 c -2.199219 0 -4 -1.800781 -4 -4 c 0 -1.621094 0.96875 -3.03125 2.367188 -3.65625 l 0.816406 1.828125 c -0.699219 0.3125 -1.183594 1 -1.183594 1.828125 c 0 1.125 0.875 2 2 2 h 3 c 1.125 0 2 -0.875 2 -2 c 0 -0.867188 -0.53125 -1.582031 -1.277344 -1.867188 l 0.714844 -1.867187 c 1.503906 0.574219 2.5625 2.039063 2.5625 3.734375 c 0 2.199219 -1.800781 4 -4 4 z m 0 0"/></g></svg>


View File

@ -1,2 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg xmlns="http://www.w3.org/2000/svg" height="16px" viewBox="0 0 16 16" width="16px"><g fill="#222222"><path d="m 3 0 c -1.644531 0 -3 1.355469 -3 3 v 6 c 0 1.644531 1.355469 3 3 3 h 1 v 4 l 4 -4 h 5 c 1.644531 0 3 -1.355469 3 -3 v -6 c 0 -1.644531 -1.355469 -3 -3 -3 z m 0 2 h 10 c 0.570312 0 1 0.429688 1 1 v 6 c 0 0.570312 -0.429688 1 -1 1 h -10 c -0.570312 0 -1 -0.429688 -1 -1 v -6 c 0 -0.570312 0.429688 -1 1 -1 z m 0 0"/><path d="m 3 3 h 9 v 2 h -9 z m 0 0"/><path d="m 3 6 h 6 v 2 h -6 z m 0 0"/></g></svg>


View File

@ -1,4 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg height="16px" viewBox="0 0 16 16" width="16px" xmlns="http://www.w3.org/2000/svg">
<path d="m 3 1 c -1.644531 0 -3 1.355469 -3 3 v 6 c 0 1.644531 1.355469 3 3 3 h 1 v 3 l 3 -3 v -1 c 0 -0.550781 -0.449219 -1 -1 -1 h -3 c -0.570312 0 -1 -0.429688 -1 -1 v -6 c 0 -0.554688 0.445312 -1 1 -1 h 10 c 0.554688 0 1 0.445312 1 1 v 4 c 0 0.550781 0.449219 1 1 1 s 1 -0.449219 1 -1 v -4 c 0 -1.644531 -1.355469 -3 -3 -3 z m 8 7 v 3 h -3 v 2 h 3 v 3 h 2 v -3 h 3 v -2 h -3 v -3 z m 0 0" fill="#2e3436"/>
</svg>


View File

@ -1,2 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg xmlns="http://www.w3.org/2000/svg" height="16px" viewBox="0 0 16 16" width="16px"><path d="m 3 2 c -0.265625 0 -0.519531 0.105469 -0.707031 0.292969 c -0.390625 0.390625 -0.390625 1.023437 0 1.414062 l 4.292969 4.292969 l -4.292969 4.292969 c -0.390625 0.390625 -0.390625 1.023437 0 1.414062 s 1.023437 0.390625 1.414062 0 l 4.292969 -4.292969 l 4.292969 4.292969 c 0.390625 0.390625 1.023437 0.390625 1.414062 0 s 0.390625 -1.023437 0 -1.414062 l -4.292969 -4.292969 l 4.292969 -4.292969 c 0.390625 -0.390625 0.390625 -1.023437 0 -1.414062 c -0.1875 -0.1875 -0.441406 -0.292969 -0.707031 -0.292969 s -0.519531 0.105469 -0.707031 0.292969 l -4.292969 4.292969 l -4.292969 -4.292969 c -0.1875 -0.1875 -0.441406 -0.292969 -0.707031 -0.292969 z m 0 0" fill="#222222"/></svg>


View File

@ -1,4 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg height="16px" viewBox="0 0 16 16" width="16px" xmlns="http://www.w3.org/2000/svg">
<path d="m 7.90625 0.09375 c -0.527344 -0.0273438 -1.039062 0.28125 -1.4375 0.96875 l -6.25 11.59375 c -0.535156 0.964844 0.046875 2.34375 1.09375 2.34375 h 13.15625 c 0.980469 0 1.902344 -1.160156 1.21875 -2.34375 l -6.3125 -11.53125 c -0.398438 -0.644531 -0.941406 -1.003906 -1.46875 -1.03125 z m 1.09375 3.90625 v 5 c 0.007812 0.527344 -0.472656 1 -1 1 s -1.007812 -0.472656 -1 -1 v -5 z m -1 7 c 0.550781 0 1 0.449219 1 1 s -0.449219 1 -1 1 s -1 -0.449219 -1 -1 s 0.449219 -1 1 -1 z m 0 0" fill="#2e3436"/>
</svg>


View File

@ -1,4 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg height="16px" viewBox="0 0 16 16" width="16px" xmlns="http://www.w3.org/2000/svg">
<path d="m 12.277344 0.832031 c -0.578125 0.007813 -1.167969 0.230469 -1.691406 0.753907 l -9 9 c -0.375 0.375 -0.585938 0.882812 -0.585938 1.414062 v 3 h 3 c 0.53125 0 1.039062 -0.210938 1.414062 -0.585938 l 9 -9 c 1.789063 -1.789062 0.082032 -4.390624 -1.890624 -4.570312 c -0.082032 -0.011719 -0.164063 -0.011719 -0.246094 -0.011719 z m -1.777344 3.605469 l 1.0625 1.0625 l -7.0625 7.0625 l -1.0625 -1.0625 z m 0 0" fill="#2e3436"/>
</svg>


View File

@ -1,7 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg height="16px" viewBox="0 0 16 16" width="16px" xmlns="http://www.w3.org/2000/svg">
<g fill="#2e3436">
<path d="m 3 1 c -1.644531 0 -3 1.355469 -3 3 v 8 c 0 1.644531 1.355469 3 3 3 h 8.882812 c 0.832032 0 1.578126 -0.402344 2.054688 -0.9375 c 0.472656 -0.53125 0.738281 -1.167969 0.910156 -1.800781 l 0.972656 -2.609375 c 0.390626 -1.449219 -0.09375 -2.652344 -0.820312 -3.167969 c -0.484375 -0.34375 -0.714844 -0.292969 -1 -0.324219 v -1.160156 c 0 -0.855469 -0.558594 -1.589844 -1.09375 -1.828125 c -0.53125 -0.238281 -1.011719 -0.167969 -1.011719 -0.167969 l 0.105469 -0.003906 h -3.585938 l -1.707031 -1.707031 c -0.1875 -0.1875 -0.441406 -0.292969 -0.707031 -0.292969 z m 0 2 h 2.585938 l 1.707031 1.707031 c 0.1875 0.1875 0.441406 0.292969 0.707031 0.292969 h 4 c 0.035156 0 0.070312 -0.003906 0.105469 -0.007812 c 0 0 0.019531 0.019531 -0.011719 0.003906 c -0.035156 -0.011719 -0.09375 -0.25 -0.09375 0.003906 v 2 c 0 0.550781 0.449219 1 1 1 c 1 0 1.046875 0.703125 0.886719 1.128906 l -0.972657 2.609375 c -0.117187 0.4375 -0.296874 0.800781 -0.472656 0.996094 c -0.175781 0.199219 -0.285156 0.265625 -0.558594 0.265625 h -8.882812 c -0.570312 0 -1 -0.429688 -1 -1 v -8 c 0 -0.570312 0.46875 -0.792969 1 -1 z m 0 0"/>
<path d="m 7 6 l 0.042969 0.003906 c -0.914063 -0.042968 -1.75 0.390625 -2.195313 0.96875 c -0.710937 1.222656 -1.15625 2.277344 -1.800781 3.71875 c -0.171875 0.523438 0.117187 1.089844 0.640625 1.261719 c 0.527344 0.171875 1.09375 -0.117187 1.261719 -0.640625 c 0.488281 -1.011719 0.921875 -1.816406 1.339843 -2.808594 c 0.210938 -0.503906 0.703126 -0.492187 0.898438 -0.503906 h 5.8125 c 0.550781 0 1 -0.449219 1 -1 s -0.449219 -1 -1 -1 z m 0 0"/>
</g>
</svg>


View File

@ -1,2 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg xmlns="http://www.w3.org/2000/svg" height="16px" viewBox="0 0 16 16" width="16px"><g fill="#222222"><path d="m 9 2 v 5 h 4 v -1 z m -4 2 v 1 h 3 v -1 z m 0 2 v 1 h 3 v -1 z m 0 2 v 1 h 6 v -1 z m 0 2 v 1 h 6 v -1 z m 0 2 v 1 h 6 v -1 z m 0 0"/><path d="m 2 13 c 0 1.660156 1.339844 3 3 3 h 6 c 1.660156 0 3 -1.339844 3 -3 v -6 c 0 -0.90625 -0.359375 -1.773438 -1 -2.414062 l -2.585938 -2.585938 c -0.640624 -0.640625 -1.507812 -1 -2.414062 -1 h -3 c -1.660156 0 -3 1.339844 -3 3 z m 3 -10 h 3 c 0.375 0 0.734375 0.148438 1 0.414062 l 2.585938 2.585938 c 0.265624 0.265625 0.414062 0.625 0.414062 1 v 6 c 0 0.546875 -0.453125 1 -1 1 h -6 c -0.546875 0 -1 -0.453125 -1 -1 v -9 c 0 -0.546875 0.453125 -1 1 -1 z m 0 0"/></g></svg>


View File

@ -1,2 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg xmlns="http://www.w3.org/2000/svg" height="16px" viewBox="0 0 16 16" width="16px"><path d="m 2.292969 6.707031 l 5 5 c 0.390625 0.390625 1.023437 0.390625 1.414062 0 l 5 -5 c 0.390625 -0.390625 0.390625 -1.023437 0 -1.414062 s -1.023437 -0.390625 -1.414062 0 l -4.292969 4.292969 l -4.292969 -4.292969 c -0.390625 -0.390625 -1.023437 -0.390625 -1.414062 0 s -0.390625 1.023437 0 1.414062 z m 0 0" fill="#222222" fill-rule="evenodd"/></svg>


View File

@ -1,4 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg height="16px" viewBox="0 0 16 16" width="16px" xmlns="http://www.w3.org/2000/svg">
<path d="m 0 3 c 0 -1.644531 1.355469 -3 3 -3 h 5 c 1.644531 0 3 1.355469 3 3 c 0 0.550781 -0.449219 1 -1 1 s -1 -0.449219 -1 -1 c 0 -0.570312 -0.429688 -1 -1 -1 h -5 c -0.570312 0 -1 0.429688 -1 1 v 5 c 0 0.570312 0.429688 1 1 1 c 0.550781 0 1 0.449219 1 1 s -0.449219 1 -1 1 c -1.644531 0 -3 -1.355469 -3 -3 z m 5 5 c 0 -1.644531 1.355469 -3 3 -3 h 5 c 1.644531 0 3 1.355469 3 3 v 5 c 0 1.644531 -1.355469 3 -3 3 h -5 c -1.644531 0 -3 -1.355469 -3 -3 z m 2 0 v 5 c 0 0.570312 0.429688 1 1 1 h 5 c 0.570312 0 1 -0.429688 1 -1 v -5 c 0 -0.570312 -0.429688 -1 -1 -1 h -5 c -0.570312 0 -1 0.429688 -1 1 z m 0 0" fill="#2e3436"/>
</svg>


View File

@ -1,4 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg height="16px" viewBox="0 0 16 16" width="16px" xmlns="http://www.w3.org/2000/svg">
<path d="m 6.5 0 c -3.578125 0 -6.5 2.921875 -6.5 6.5 s 2.921875 6.5 6.5 6.5 c 1.429688 0 2.753906 -0.46875 3.828125 -1.257812 l 2.945313 2.945312 c 0.957031 0.9375 2.363281 -0.5 1.40625 -1.4375 l -2.929688 -2.929688 c 0.785156 -1.074218 1.25 -2.394531 1.25 -3.820312 c 0 -3.578125 -2.921875 -6.5 -6.5 -6.5 z m 0 2 c 2.496094 0 4.5 2.003906 4.5 4.5 s -2.003906 4.5 -4.5 4.5 s -4.5 -2.003906 -4.5 -4.5 s 2.003906 -4.5 4.5 -4.5 z m 0 0" fill="#2e3436"/>
</svg>


View File

@ -1,2 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg xmlns="http://www.w3.org/2000/svg" height="16px" viewBox="0 0 16 16" width="16px"><path d="m 12.277344 0.832031 c -0.578125 0.007813 -1.167969 0.230469 -1.691406 0.753907 l -9 9 c -0.375 0.375 -0.585938 0.882812 -0.585938 1.414062 v 3 h 3 c 0.53125 0 1.039062 -0.210938 1.414062 -0.585938 l 9 -9 c 1.789063 -1.789062 0.082032 -4.390624 -1.890624 -4.570312 c -0.082032 -0.011719 -0.164063 -0.011719 -0.246094 -0.011719 z m -1.777344 3.605469 l 1.0625 1.0625 l -7.0625 7.0625 l -1.0625 -1.0625 z m 0 0" fill="#222222"/></svg>


View File

@ -1,2 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg xmlns="http://www.w3.org/2000/svg" height="16px" viewBox="0 0 16 16" width="16px"><path d="m 4.992188 2.996094 v 10 h 1 c 0.175781 0 0.347656 -0.039063 0.5 -0.125 l 7 -4 c 0.308593 -0.171875 0.46875 -0.523438 0.46875 -0.875 c 0 -0.351563 -0.160157 -0.703125 -0.46875 -0.875 l -7 -4 c -0.152344 -0.085938 -0.324219 -0.125 -0.5 -0.125 z m 0 0" fill="#222222"/></svg>


View File

@ -1,4 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg height="16px" viewBox="0 0 16 16" width="16px" xmlns="http://www.w3.org/2000/svg">
<path d="m 8 0 c -0.550781 0 -1 0.449219 -1 1 v 8.585938 l -2.292969 -2.292969 c -0.1875 -0.1875 -0.441406 -0.292969 -0.707031 -0.292969 s -0.519531 0.105469 -0.707031 0.292969 c -0.390625 0.390625 -0.390625 1.023437 0 1.414062 l 4 4 c 0.390625 0.390625 1.023437 0.390625 1.414062 0 l 4 -4 c 0.390625 -0.390625 0.390625 -1.023437 0 -1.414062 s -1.023437 -0.390625 -1.414062 0 l -2.292969 2.292969 v -8.585938 c 0 -0.550781 -0.449219 -1 -1 -1 z m -6 14 v 2 h 12 v -2 z m 0 0" fill="#2e3436"/>
</svg>


View File

@ -1,2 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg xmlns="http://www.w3.org/2000/svg" height="16px" viewBox="0 0 16 16" width="16px"><path d="m 7.5 0 c -4.128906 0 -7.5 3.371094 -7.5 7.5 s 3.371094 7.5 7.5 7.5 s 7.5 -3.371094 7.5 -7.5 s -3.371094 -7.5 -7.5 -7.5 z m 0 2 c 0.257812 0 0.503906 0.023438 0.75 0.054688 c 0.191406 0.261718 0.382812 0.59375 0.550781 1.027343 c 0.105469 0.277344 0.203125 0.585938 0.289063 0.917969 h -3.179688 c 0.085938 -0.332031 0.183594 -0.640625 0.289063 -0.917969 c 0.167969 -0.433593 0.359375 -0.765625 0.550781 -1.027343 c 0.246094 -0.03125 0.492188 -0.054688 0.75 -0.054688 z m -2.085938 0.40625 c -0.050781 0.109375 -0.105468 0.203125 -0.148437 0.316406 c -0.148437 0.386719 -0.269531 0.820313 -0.378906 1.277344 h -1.617188 c 0.570313 -0.691406 1.296875 -1.246094 2.144531 -1.59375 z m 4.171876 0 c 0.847656 0.347656 1.574218 0.902344 2.144531 1.59375 h -1.617188 c -0.109375 -0.457031 -0.230469 -0.890625 -0.378906 -1.277344 c -0.042969 -0.113281 -0.097656 -0.207031 -0.148437 -0.316406 z m -6.980469 2.59375 h 2.082031 c -0.097656 0.628906 -0.148438 1.300781 -0.167969 2 h -2.480469 c 0.0625 -0.714844 0.253907 -1.390625 0.566407 -2 z m 3.09375 0 h 3.601562 c 0.101563 0.617188 0.15625 1.292969 0.179688 2 h -3.960938 c 0.023438 -0.707031 0.078125 -1.382812 0.179688 -2 z m 4.613281 0 h 2.082031 c 0.3125 0.609375 0.503907 1.285156 0.566407 2 h -2.480469 c -0.019531 -0.699219 -0.070313 -1.371094 -0.167969 -2 z m -8.273438 3 h 2.480469 c 0.019531 0.699219 0.070313 1.375 0.167969 2 h -2.082031 c -0.3125 -0.609375 -0.503907 -1.285156 -0.566407 -2 z m 3.480469 0 h 3.960938 c -0.023438 0.707031 -0.078125 1.382812 -0.179688 2 h -3.601562 c -0.101563 -0.617188 -0.15625 -1.292969 -0.179688 -2 z m 4.960938 0 h 2.480469 c -0.0625 0.714844 -0.253907 1.390625 -0.566407 2 h -2.082031 c 0.097656 -0.625 0.148438 -1.300781 0.167969 -2 z m -7.210938 3 h 1.617188 c 0.109375 0.457031 0.230469 0.890625 0.378906 1.273438 c 0.042969 0.117187 0.097656 0.210937 0.148437 0.320312 c -0.847656 -0.347656 -1.574218 -0.902344 -2.144531 -1.59375 z m 2.640625 0 h 3.179688 c -0.085938 0.332031 -0.183594 0.640625 -0.289063 0.917969 c -0.167969 0.433593 -0.359375 0.765625 -0.550781 1.027343 c -0.246094 0.03125 -0.496094 0.054688 -0.75 0.054688 s -0.503906 -0.023438 -0.75 -0.054688 c -0.191406 -0.261718 -0.382812 -0.59375 -0.550781 -1.027343 c -0.105469 -0.277344 -0.203125 -0.585938 -0.289063 -0.917969 z m 4.203125 0 h 1.617188 c -0.570313 0.691406 -1.296875 1.246094 -2.144531 1.59375 c 0.050781 -0.109375 0.105468 -0.203125 0.148437 -0.320312 c 0.148437 -0.382813 0.269531 -0.816407 0.378906 -1.273438 z m 0 0" fill="#222222"/></svg>


View File

@ -1,4 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg height="16px" viewBox="0 0 16 16" width="16px" xmlns="http://www.w3.org/2000/svg">
<path d="m 4 1 c -1.644531 0 -3 1.355469 -3 3 v 1 h 1 v -1 c 0 -1.109375 0.890625 -2 2 -2 h 1 v -1 z m 2 0 v 1 h 4 v -1 z m 5 0 v 1 h 1 c 1.109375 0 2 0.890625 2 2 v 1 h 1 v -1 c 0 -1.644531 -1.355469 -3 -3 -3 z m -5 4 c -0.550781 0 -1 0.449219 -1 1 s 0.449219 1 1 1 s 1 -0.449219 1 -1 s -0.449219 -1 -1 -1 z m -5 1 v 4 h 1 v -4 z m 13 0 v 4 h 1 v -4 z m -4.5 2 l -2 2 l -1.5 -1 l -2 2 v 0.5 c 0 0.5 0.5 0.5 0.5 0.5 h 7 s 0.472656 -0.035156 0.5 -0.5 v -1 z m -8.5 3 v 1 c 0 1.644531 1.355469 3 3 3 h 1 v -1 h -1 c -1.109375 0 -2 -0.890625 -2 -2 v -1 z m 13 0 v 1 c 0 1.109375 -0.890625 2 -2 2 h -1 v 1 h 1 c 1.644531 0 3 -1.355469 3 -3 v -1 z m -8 3 v 1 h 4 v -1 z m 0 0" fill="#2e3434" fill-opacity="0.34902"/>
</svg>

View File

@ -1,7 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg height="16px" viewBox="0 0 16 16" width="16px" xmlns="http://www.w3.org/2000/svg">
<g fill="#2e3436">
<path d="m 6 5 c -0.550781 0 -1 0.449219 -1 1 s 0.449219 1 1 1 s 1 -0.449219 1 -1 s -0.449219 -1 -1 -1 z m 3.5 3 l -2 2 l -1.5 -1 l -2 2 v 0.5 c 0 0.5 0.5 0.5 0.5 0.5 h 7 s 0.472656 -0.035156 0.5 -0.5 v -1 z m 0 0"/>
<path d="m 4 1 c -1.644531 0 -3 1.355469 -3 3 v 8 c 0 1.644531 1.355469 3 3 3 h 8 c 1.644531 0 3 -1.355469 3 -3 v -8 c 0 -1.644531 -1.355469 -3 -3 -3 z m 0 2 h 8 c 0.570312 0 1 0.429688 1 1 v 8 c 0 0.570312 -0.429688 1 -1 1 h -8 c -0.570312 0 -1 -0.429688 -1 -1 v -8 c 0 -0.570312 0.429688 -1 1 -1 z m 0 0"/>
</g>
</svg>

View File

@ -1,2 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg xmlns="http://www.w3.org/2000/svg" height="16px" viewBox="0 0 16 16" width="16px"><path d="m 8 0 c -4.410156 0 -8 3.589844 -8 8 s 3.589844 8 8 8 s 8 -3.589844 8 -8 s -3.589844 -8 -8 -8 z m 0 2 c 3.332031 0 6 2.667969 6 6 s -2.667969 6 -6 6 s -6 -2.667969 -6 -6 s 2.667969 -6 6 -6 z m 0 1.875 c -0.621094 0 -1.125 0.503906 -1.125 1.125 s 0.503906 1.125 1.125 1.125 s 1.125 -0.503906 1.125 -1.125 s -0.503906 -1.125 -1.125 -1.125 z m -1.523438 3.125 c -0.265624 0.011719 -0.476562 0.230469 -0.476562 0.5 c 0 0.277344 0.222656 0.5 0.5 0.5 h 0.5 v 3 h -0.5 c -0.277344 0 -0.5 0.222656 -0.5 0.5 s 0.222656 0.5 0.5 0.5 h 3 c 0.277344 0 0.5 -0.222656 0.5 -0.5 s -0.222656 -0.5 -0.5 -0.5 h -0.5 v -4 h -2.5 c -0.007812 0 -0.015625 0 -0.023438 0 z m 0 0" fill="#222222"/></svg>

View File

@ -1,4 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg height="16px" viewBox="0 0 16 16" width="16px" xmlns="http://www.w3.org/2000/svg">
<path d="m 7 1 v 6 h -6 v 2 h 6 v 6 h 2 v -6 h 6 v -2 h -6 v -6 z m 0 0" fill="#2e3436"/>
</svg>

View File

@ -1,4 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg height="16px" viewBox="0 0 16 16" width="16px" xmlns="http://www.w3.org/2000/svg">
<path d="m 3.5 2 h 9 c 0.828125 0 1.5 0.671875 1.5 1.5 v 9 c 0 0.828125 -0.671875 1.5 -1.5 1.5 h -9 c -0.828125 0 -1.5 -0.671875 -1.5 -1.5 v -9 c 0 -0.828125 0.671875 -1.5 1.5 -1.5 z m 0 0" fill="#2e3436"/>
</svg>

View File

@ -1,8 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg height="16px" viewBox="0 0 16 16" width="16px" xmlns="http://www.w3.org/2000/svg">
<g fill="#2e3436">
<path d="m 1 2 h 14 v 2 h -14 z m 0 0"/>
<path d="m 1 7 h 14 v 2 h -14 z m 0 0"/>
<path d="m 1 12 h 14 v 2 h -14 z m 0 0"/>
</g>
</svg>

View File

@ -1,2 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg xmlns="http://www.w3.org/2000/svg" height="16px" viewBox="0 0 16 16" width="16px"><path d="m 15 8 l -14 -7 v 6 l 8 1 l -8 1 v 6 z m 0 0" fill="#222222"/></svg>

View File

@ -1,2 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg xmlns="http://www.w3.org/2000/svg" height="16px" viewBox="0 0 16 16" width="16px"><path d="m 4.003906 4.070312 v 7.859376 c 0 1.070312 0.90625 1.066406 0.90625 1.066406 h 0.09375 c 0.171875 0 0.347656 -0.039063 0.5 -0.125 l 7 -4 c 0.308594 -0.171875 0.46875 -0.523438 0.46875 -0.875 c 0 -0.351563 -0.160156 -0.703125 -0.46875 -0.875 l -7 -4 c -0.152344 -0.085938 -0.328125 -0.125 -0.5 -0.125 h -0.09375 s -0.90625 0 -0.90625 1.074218 z m 0 0" fill="#222222"/></svg>

View File

@ -1,2 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg xmlns="http://www.w3.org/2000/svg" height="16px" viewBox="0 0 16 16" width="16px"><path d="m 7.5 1.019531 c -0.550781 0 -0.996094 0.445313 -0.996094 0.996094 v 0.453125 c -0.472656 0.128906 -0.929687 0.320312 -1.355468 0.566406 l -0.324219 -0.324218 c -0.390625 -0.390626 -1.019531 -0.390626 -1.410157 0 l -0.703124 0.707031 c -0.390626 0.390625 -0.390626 1.019531 0 1.410156 l 0.320312 0.320313 c -0.246094 0.425781 -0.433594 0.882812 -0.5625 1.355468 h -0.453125 c -0.550781 0 -0.996094 0.445313 -0.996094 0.996094 v 1 c 0 0.550781 0.445313 0.996094 0.996094 0.996094 h 0.449219 c 0.132812 0.472656 0.320312 0.929687 0.566406 1.355468 l -0.320312 0.320313 c -0.390626 0.390625 -0.390626 1.019531 0 1.410156 l 0.703124 0.707031 c 0.390626 0.390626 1.019532 0.390626 1.410157 0 l 0.320312 -0.320312 c 0.429688 0.242188 0.882813 0.433594 1.359375 0.558594 v 0.457031 c 0 0.550781 0.445313 0.996094 0.996094 0.996094 h 0.996094 c 0.554687 0 1 -0.445313 1 -0.996094 v -0.453125 c 0.472656 -0.128906 0.929687 -0.320312 1.355468 -0.566406 l 0.320313 0.324218 c 0.390625 0.390626 1.019531 0.390626 1.410156 0 l 0.707031 -0.707031 c 0.390626 -0.390625 0.390626 -1.019531 0 -1.410156 l -0.320312 -0.320313 c 0.242188 -0.425781 0.433594 -0.882812 0.558594 -1.355468 h 0.453125 c 0.554687 0 1 -0.445313 1 -0.996094 v -1 c 0 -0.550781 -0.445313 -0.996094 -1 -0.996094 h -0.449219 c -0.128906 -0.472656 -0.320312 -0.929687 -0.566406 -1.355468 l 0.324218 -0.320313 c 0.390626 -0.390625 0.390626 -1.019531 0 -1.410156 l -0.707031 -0.707031 c -0.390625 -0.390626 -1.019531 -0.390626 -1.410156 0 l -0.320313 0.320312 c -0.425781 -0.242188 -0.882812 -0.429688 -1.355468 -0.558594 v -0.457031 c 0 -0.550781 -0.445313 -0.996094 -1 -0.996094 z m 0.515625 3.976563 c 1.660156 0 3 1.34375 3 3 s -1.339844 3 -3 3 c -1.65625 0 -3 -1.34375 -3 -3 s 1.34375 -3 3 -3 z m 0 0" fill="#222222"/></svg>

View File

@ -1,2 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg xmlns="http://www.w3.org/2000/svg" height="16px" viewBox="0 0 16 16" width="16px"><g fill="#222222"><path d="m 4 6 h 1 c 0.257812 0 0.527344 -0.128906 0.71875 -0.3125 l 1.28125 -1.28125 v 6.59375 h 2 v -6.59375 l 1.28125 1.28125 c 0.191406 0.183594 0.410156 0.3125 0.71875 0.3125 h 1 v -1 c 0 -0.308594 -0.089844 -0.550781 -0.28125 -0.75 l -3.71875 -3.65625 l -3.71875 3.65625 c -0.191406 0.199219 -0.28125 0.441406 -0.28125 0.75 z m 0 0"/><path d="m 1.007812 11.972656 c 0 1.664063 1.367188 3.035156 3.03125 3.035156 h 7.917969 c 1.664063 0 3.03125 -1.371093 3.03125 -3.035156 v -1.972656 h -2 v 1.972656 c 0 0.589844 -0.441406 1.035156 -1.03125 1.035156 h -7.917969 c -0.589843 0 -1.03125 -0.445312 -1.03125 -1.035156 v -1.972656 h -2 z m 0 0"/><path d="m 4.039062 6.96875 c -1.664062 0 -3.03125 1.367188 -3.03125 3.03125 v 1.976562 h 2 v -1.976562 c 0 -0.589844 0.441407 -1.03125 1.03125 -1.03125 h 0.960938 v -2 z m 6.960938 0 v 2 h 0.957031 c 0.589844 0 1.03125 0.441406 1.03125 1.03125 v 1.976562 h 2 v -1.976562 c 0 -1.664062 -1.367187 -3.03125 -3.03125 -3.03125 z m 0 0"/></g></svg>

View File

@ -1,8 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg height="16px" viewBox="0 0 16 16" width="16px" xmlns="http://www.w3.org/2000/svg">
<g fill="#2e3436">
<path d="m 6.5 14 v -12 h -5 v 12 z m 0 0" fill-opacity="0.34902"/>
<path d="m 3 1 c -1.644531 0 -3 1.355469 -3 3 v 8 c 0 1.644531 1.355469 3 3 3 h 10 c 1.644531 0 3 -1.355469 3 -3 v -8 c 0 -1.644531 -1.355469 -3 -3 -3 z m 0 2 h 10 c 0.570312 0 1 0.429688 1 1 v 8 c 0 0.570312 -0.429688 1 -1 1 h -10 c -0.570312 0 -1 -0.429688 -1 -1 v -8 c 0 -0.570312 0.429688 -1 1 -1 z m 0 0"/>
<path d="m 6 2 h 1 v 12 h -1 z m 0 0"/>
</g>
</svg>

View File

@ -1,2 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg xmlns="http://www.w3.org/2000/svg" height="16px" viewBox="0 0 16 16" width="16px"><g fill="#222222"><path d="m 9 2.933594 c 4.042969 0 5.890625 3.613281 5.890625 3.613281 l -1.785156 0.902344 s -1.246094 -2.515625 -4.105469 -2.515625 c -2.054688 0 -3.097656 1.394531 -3.484375 2.074218 h 0.910156 c 1.332031 0 1.574219 1.253907 1.574219 2.035157 l -6 -0.046875 l -0.046875 -6 c 1.046875 0 2.035156 0.523437 2.035156 1.667968 v 0.929688 l 0.0625 0.0625 c 0.757813 -1.089844 2.3125 -2.722656 4.949219 -2.722656 z m 0 0"/><path d="m 13.617188 8.996094 c 0.683593 0 1.265624 0.582031 1.265624 1.265625 v 3.46875 c 0 0.683593 -0.582031 1.265625 -1.265624 1.265625 h -3.46875 c -0.683594 0 -1.265626 -0.582032 -1.265626 -1.265625 v -3.46875 c 0 -0.683594 0.582032 -1.265625 1.265626 -1.265625 z m -0.734376 2 h -2 v 2 h 2 z m 0 0"/></g></svg>

View File

@ -1,2 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg xmlns="http://www.w3.org/2000/svg" height="16px" viewBox="0 0 16 16" width="16px"><g fill="#222222"><path d="m 7 2.933594 c -4.042969 0 -5.894531 3.613281 -5.894531 3.613281 l 1.785156 0.902344 s 1.25 -2.515625 4.109375 -2.515625 c 2.054688 0 3.097656 1.394531 3.484375 2.074218 h -0.910156 c -1.332031 0 -1.574219 1.253907 -1.574219 2.035157 l 6 -0.046875 l 0.042969 -6 c -1.042969 0 -2.03125 0.523437 -2.03125 1.667968 v 0.929688 l -0.0625 0.0625 c -0.757813 -1.089844 -2.316407 -2.722656 -4.949219 -2.722656 z m 0 0"/><path d="m 2.382812 8.996094 c -0.683593 0 -1.265624 0.582031 -1.265624 1.265625 v 3.46875 c 0 0.683593 0.582031 1.265625 1.265624 1.265625 h 3.46875 c 0.683594 0 1.265626 -0.582032 1.265626 -1.265625 v -3.46875 c 0 -0.683594 -0.582032 -1.265625 -1.265626 -1.265625 z m 0.734376 2 h 2 v 2 h -2 z m 0 0"/></g></svg>

View File

@ -1,2 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg xmlns="http://www.w3.org/2000/svg" height="16px" viewBox="0 0 16 16" width="16px"><g fill="#222222"><path d="m 7.957031 2 c -0.082031 0 -0.164062 0.003906 -0.246093 0.007812 c -0.1875 0.011719 -0.375 0.03125 -0.5625 0.0625 c -1.582032 0.226563 -3.007813 1.070313 -3.96875 2.34375 c -0.804688 1.074219 -1.183594 2.332032 -1.179688 3.585938 h 2.003906 c 0 -0.832031 0.253906 -1.671875 0.796875 -2.398438 c 1.335938 -1.777343 3.820313 -2.113281 5.597657 -0.78125 c 0.429687 0.320313 0.769531 0.734376 1.03125 1.1875 h -1.4375 c -0.550782 0 -1 0.449219 -1 1 v 1 h 6 v -6 h -1 c -0.550782 0 -1 0.449219 -1 1 v 1.6875 c -1.113282 -1.695312 -3.007813 -2.710937 -5.039063 -2.695312 z m 0 0"/><path d="m 8.035156 15.007812 c 0.082032 0 0.164063 -0.003906 0.246094 -0.007812 c 0.1875 -0.011719 0.375 -0.03125 0.5625 -0.0625 c 1.582031 -0.226562 3.007812 -1.066406 3.96875 -2.34375 c 0.804688 -1.074219 1.183594 -2.332031 1.179688 -3.585938 h -2.003907 c -0.003906 0.832032 -0.257812 1.675782 -0.796875 2.398438 c -1.335937 1.777344 -3.820312 2.113281 -5.597656 0.78125 c -0.429688 -0.320312 -0.769531 -0.734375 -1.03125 -1.1875 h 1.4375 c 0.550781 0 1 -0.449219 1 -1 v -1 h -6 v 6 h 1 c 0.550781 0 1 -0.449219 1 -1 v -1.6875 c 1.113281 1.695312 3.007812 2.710938 5.035156 2.695312 z m 0 0"/></g></svg>

View File

@ -1,10 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg height="16px" viewBox="0 0 16 16" width="16px" xmlns="http://www.w3.org/2000/svg">
<g fill="#2e3436">
<path d="m 1 3 h 14 c 0.550781 0 1 0.449219 1 1 s -0.449219 1 -1 1 h -14 c -0.550781 0 -1 -0.449219 -1 -1 s 0.449219 -1 1 -1 z m 0 0"/>
<path d="m 4 4 v -1.5 c 0 -1.386719 1.113281 -2.5 2.5 -2.5 h 2.980469 c 1.382812 0 2.5 1.113281 2.5 2.5 v 1.5 h -2 v -1.5 c 0 -0.269531 -0.230469 -0.5 -0.5 -0.5 h -2.980469 c -0.269531 0 -0.5 0.230469 -0.5 0.5 v 1.5 z m 0 0"/>
<path d="m 4 4 v 9 c 0 0.546875 0.453125 1 1 1 h 6 c 0.546875 0 1 -0.453125 1 -1 v -9 h 2 v 9 c 0 1.660156 -1.339844 3 -3 3 h -6 c -1.660156 0 -3 -1.339844 -3 -3 v -9 z m 0 0"/>
<path d="m 7 7 v 5 c 0 0.277344 -0.222656 0.5 -0.5 0.5 s -0.5 -0.222656 -0.5 -0.5 v -5 c 0 -0.277344 0.222656 -0.5 0.5 -0.5 s 0.5 0.222656 0.5 0.5 z m 0 0"/>
<path d="m 10 7 v 5 c 0 0.277344 -0.222656 0.5 -0.5 0.5 s -0.5 -0.222656 -0.5 -0.5 v -5 c 0 -0.277344 0.222656 -0.5 0.5 -0.5 s 0.5 0.222656 0.5 0.5 z m 0 0"/>
</g>
</svg>

View File

@ -1,4 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg height="16px" viewBox="0 0 16 16" width="16px" xmlns="http://www.w3.org/2000/svg">
<path d="m 7.996094 0 c -1.105469 0 -2 0.894531 -2 2 s 0.894531 2 2 2 c 1.101562 0 2 -0.894531 2 -2 s -0.898438 -2 -2 -2 z m 0 6 c -1.105469 0 -2 0.894531 -2 2 s 0.894531 2 2 2 c 1.101562 0 2 -0.894531 2 -2 s -0.898438 -2 -2 -2 z m 0 6 c -1.105469 0 -2 0.894531 -2 2 s 0.894531 2 2 2 c 1.101562 0 2 -0.894531 2 -2 s -0.898438 -2 -2 -2 z m 0 0" fill="#2e3436"/>
</svg>

View File

@ -1,25 +0,0 @@
# internal.py
"""
Handles paths, they can be different if the app is running as a Flatpak
"""
import os
APP_ID = "com.jeffser.Alpaca"
IN_FLATPAK = bool(os.getenv("FLATPAK_ID"))
def get_xdg_home(env, default):
if IN_FLATPAK:
return os.getenv(env)
base = os.getenv(env) or os.path.expanduser(default)
path = os.path.join(base, APP_ID)
if not os.path.exists(path):
os.makedirs(path)
return path
data_dir = get_xdg_home("XDG_DATA_HOME", "~/.local/share")
config_dir = get_xdg_home("XDG_CONFIG_HOME", "~/.config")
cache_dir = get_xdg_home("XDG_CACHE_HOME", "~/.cache")
source_dir = os.path.abspath(os.path.dirname(__file__))
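Editor's note: the removed internal.py above resolves per-app XDG directories, behaving differently inside and outside Flatpak. A minimal usage sketch (not part of the diff; the import path, environment, and paths printed are assumptions):

# Hedged usage sketch of the path helpers above; import path assumed.
import os
from internal import data_dir, cache_dir, config_dir, IN_FLATPAK

# Outside Flatpak each *_dir ends in the app id and is created on first use;
# inside Flatpak the XDG_* variables already point at app-private paths,
# so they are returned unchanged.
log_path = os.path.join(data_dir, 'tmp.log')
tmp_path = os.path.join(cache_dir, 'tmp')
os.makedirs(tmp_path, exist_ok=True)
print(IN_FLATPAK, log_path, config_dir)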

View File

@ -1,6 +1,6 @@
# main.py
#
# Copyright 2024 Jeffser
# Copyright 2024 Unknown
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
@ -16,53 +16,26 @@
# along with this program. If not, see <https://www.gnu.org/licenses/>.
#
# SPDX-License-Identifier: GPL-3.0-or-later
"""
Main script run at launch, handles actions, about dialog and the app itself (not the window)
"""
import gi
gi.require_version('Gtk', '4.0')
gi.require_version('Adw', '1')
from gi.repository import Gtk, Gio, Adw, GLib
from .window import AlpacaWindow
from .internal import cache_dir, data_dir
import sys
import logging
import os
import gi
logger = logging.getLogger(__name__)
gi.require_version('Gtk', '4.0')
gi.require_version('Adw', '1')
translators = [
'Alex K (Russian) https://github.com/alexkdeveloper',
'Jeffry Samuel (Spanish) https://github.com/jeffser',
'Louis Chauvet-Villaret (French) https://github.com/loulou64490',
'Théo FORTIN (French) https://github.com/topiga',
'Daimar Stein (Brazilian Portuguese) https://github.com/not-a-dev-stein',
'Bruno Antunes (Brazilian Portuguese) https://github.com/antun3s',
'CounterFlow64 (Norwegian) https://github.com/CounterFlow64',
'Aritra Saha (Bengali) https://github.com/olumolu',
'Yuehao Sui (Simplified Chinese) https://github.com/8ar10der',
'Aleksana (Simplified Chinese) https://github.com/Aleksanaa',
'Aritra Saha (Hindi) https://github.com/olumolu',
'YusaBecerikli (Turkish) https://github.com/YusaBecerikli',
'Simon (Ukrainian) https://github.com/OriginalSimon',
'Marcel Margenberg (German) https://github.com/MehrzweckMandala'
]
from gi.repository import Gtk, Gio, Adw
from .window import AlpacaWindow
class AlpacaApplication(Adw.Application):
"""The main application singleton class."""
def __init__(self, version):
def __init__(self):
super().__init__(application_id='com.jeffser.Alpaca',
flags=Gio.ApplicationFlags.DEFAULT_FLAGS)
self.create_action('quit', lambda *_: self.props.active_window.closing_app(None), ['<primary>q'])
self.set_accels_for_action('app.delete_current_chat', ['<primary>w'])
self.create_action('preferences', lambda *_: self.props.active_window.preferences_dialog.present(self.props.active_window), ['<primary>comma'])
self.create_action('quit', lambda *_: self.quit(), ['<primary>q'])
self.create_action('clear', lambda *_: AlpacaWindow.clear_conversation_dialog(self.props.active_window), ['<primary>e'])
self.create_action('reconnect', lambda *_: AlpacaWindow.show_connection_dialog(self.props.active_window), ['<primary>r'])
self.create_action('about', self.on_about_action)
self.set_accels_for_action("win.show-help-overlay", ['<primary>slash'])
self.version = version
def do_activate(self):
win = self.props.active_window
@ -71,22 +44,17 @@ class AlpacaApplication(Adw.Application):
win.present()
def on_about_action(self, widget, _):
about = Adw.AboutDialog(#transient_for=self.props.active_window,
about = Adw.AboutWindow(transient_for=self.props.active_window,
application_name='Alpaca',
application_icon='com.jeffser.Alpaca',
developer_name='Jeffry Samuel Eduarte Rojas',
version=self.version,
support_url="https://github.com/Jeffser/Alpaca/discussions/155",
version='0.4.0',
developers=['Jeffser https://jeffser.com'],
designers=['Jeffser https://jeffser.com', 'Tobias Bernard (App Icon) https://tobiasbernard.com/'],
translator_credits='\n'.join(translators),
copyright='© 2024 Jeffser\n© 2024 Ollama',
issue_url='https://github.com/Jeffser/Alpaca/issues',
license_type=3,
website="https://jeffser.com/alpaca",
debug_info=open(os.path.join(data_dir, 'tmp.log'), 'r').read())
about.add_link("Become a Sponsor", "https://github.com/sponsors/Jeffser")
about.present(parent=self.props.active_window)
designers=['Jeffser https://jeffser.com'],
translator_credits='Alex K (Russian) https://github.com/alexkdeveloper',
copyright='© 2024 Jeffser',
issue_url='https://github.com/Jeffser/Alpaca/issues')
about.present()
def create_action(self, name, callback, shortcuts=None):
action = Gio.SimpleAction.new(name, None)
@ -97,17 +65,5 @@ class AlpacaApplication(Adw.Application):
def main(version):
if os.path.isfile(os.path.join(data_dir, 'tmp.log')):
os.remove(os.path.join(data_dir, 'tmp.log'))
if os.path.isdir(os.path.join(cache_dir, 'tmp')):
os.system('rm -rf ' + os.path.join(cache_dir, "tmp/*"))
else:
os.mkdir(os.path.join(cache_dir, 'tmp'))
logging.basicConfig(
format="%(levelname)s\t[%(filename)s | %(funcName)s] %(message)s",
level=logging.INFO,
handlers=[logging.FileHandler(filename=os.path.join(data_dir, 'tmp.log')), logging.StreamHandler(stream=sys.stdout)]
)
app = AlpacaApplication(version)
logger.info(f"Alpaca version: {app.version}")
app = AlpacaApplication()
return app.run(sys.argv)
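Editor's note: one side of the main() diff above empties the cache's tmp directory via os.system('rm -rf ...'). The following is a hedged, pure-Python sketch with the same effect (not the project's code; it assumes cache_dir as defined in internal.py):

# Sketch only: empty cache_dir/tmp without shelling out, keeping the directory.
import os
import shutil
from internal import cache_dir  # import path assumed

tmp_dir = os.path.join(cache_dir, 'tmp')
if os.path.isdir(tmp_dir):
    for entry in os.listdir(tmp_dir):
        path = os.path.join(tmp_dir, entry)
        if os.path.isdir(path):
            shutil.rmtree(path)   # remove nested directories
        else:
            os.remove(path)       # remove plain files
else:
    os.mkdir(tmp_dir)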

View File

@ -26,34 +26,12 @@ configure_file(
install_mode: 'r-xr-xr-x'
)
#configure_file(
#input: 'alpaca_search_provider.in',
#output: 'alpaca_search_provider',
#configuration: conf,
#install: true,
#install_dir: get_option('bindir'),
#install_mode: 'r-xr-xr-x'
#)
alpaca_sources = [
'__init__.py',
'main.py',
'window.py',
'connection_handler.py',
'available_models.json',
'available_models_descriptions.py',
'internal.py',
'generic_actions.py'
]
custom_widgets = [
'custom_widgets/table_widget.py',
'custom_widgets/message_widget.py',
'custom_widgets/chat_widget.py',
'custom_widgets/model_widget.py',
'custom_widgets/terminal_widget.py',
'custom_widgets/dialog_widget.py'
'available_models.py',
]
install_data(alpaca_sources, install_dir: moduledir)
install_data(custom_widgets, install_dir: moduledir / 'custom_widgets')

View File

@ -1,45 +0,0 @@
.message_text_view, .modelfile_textview {
background-color: rgba(0,0,0,0);
}
.chat_image_button {
padding: 0;
}
.chat_image_button, .chat_image_button image {
border-radius: 10px;
}
.editing_message_textview {
border-radius: 5px;
padding: 5px;
}
.model_list_box {
padding: 0;
}
.manage_models_button {
padding: 6px 8px 6px 8px;
font-weight: 400;
}
.model_list_box > * {
margin: 0;
}
.user_message > label, .response_message > label {
padding: 7px;
border-radius: 10px;
}
.user_message label:focus, .response_message label:focus, .editing_message_textview:focus, .code_block:focus {
box-shadow: 0 0 1px 2px mix(@accent_color, @window_bg_color, 0.5);
}
.model_popover {
margin-top: 6px;
}
stacksidebar {
border: none;
}
.image_recognition_indicator {
padding: 0px 10px;
}
.code_block {
font-family: monospace;
}
.terminal {
padding: 10px;
}

File diff suppressed because it is too large

File diff suppressed because it is too large

View File

@ -1,30 +0,0 @@
#!/usr/bin/env bash
cd "$(dirname "$0")"
echo "Preparing template..."
xgettext --output=po/alpaca.pot --files-from=po/POTFILES
echo "Updating Spanish..."
msgmerge --no-fuzzy-matching -U po/es.po po/alpaca.pot
echo "Updating Russian..."
msgmerge --no-fuzzy-matching -U po/ru.po po/alpaca.pot
echo "Updating French"
msgmerge --no-fuzzy-matching -U po/fr.po po/alpaca.pot
echo "Updating Brazilian Portuguese"
msgmerge --no-fuzzy-matching -U po/pt_BR.po po/alpaca.pot
echo "Updating Norwegian"
msgmerge --no-fuzzy-matching -U po/nb_NO.po po/alpaca.pot
echo "Updating Bengali"
msgmerge --no-fuzzy-matching -U po/bn.po po/alpaca.pot
echo "Updating Simplified Chinese"
msgmerge --no-fuzzy-matching -U po/zh_Hans.po po/alpaca.pot
echo "Updating Hindi"
msgmerge --no-fuzzy-matching -U po/hi.po po/alpaca.pot
echo "Updating Turkish"
msgmerge --no-fuzzy-matching -U po/tr.po po/alpaca.pot
echo "Updating Ukrainian"
msgmerge --no-fuzzy-matching -U po/uk.po po/alpaca.pot
echo "Updating German"
msgmerge --no-fuzzy-matching -U po/de.po po/alpaca.pot
echo "Updating Hebrew"
msgmerge --no-fuzzy-matching -U po/he.po po/alpaca.pot
echo "Updating Telugu"
msgmerge --no-fuzzy-matching -U po/te.po po/alpaca.pot
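Editor's note: the shell script above regenerates the template with xgettext and then repeats the same msgmerge invocation once per locale. A hedged Python equivalent of that loop (not part of the repository; it assumes the gettext command-line tools are installed and that it is run from the repository root):

# Sketch: refresh the template, then update every .po file the script handles.
import subprocess

locales = ['es', 'ru', 'fr', 'pt_BR', 'nb_NO', 'bn', 'zh_Hans',
           'hi', 'tr', 'uk', 'de', 'he', 'te']
subprocess.run(['xgettext', '--output=po/alpaca.pot',
                '--files-from=po/POTFILES'], check=True)
for locale in locales:
    print(f'Updating {locale}...')
    subprocess.run(['msgmerge', '--no-fuzzy-matching', '-U',
                    f'po/{locale}.po', 'po/alpaca.pot'], check=True)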