- FIIT STU
- Lipany, Slovakia
- https://www.linkedin.com/in/plasmoxy/
Highlights
- Pro
Lists (3)

Stars
- A free, offline, and openly licensed Japanese-to-English dictionary. Updates weekly!
- themoeway / yomitan (forked from FooSoft/yomichan): Japanese pop-up dictionary browser extension. Successor to Yomichan.
- Popular Node.js module for parsing JavaScript objects into XML
- A SOAP client and server for node.js.
- A simple & customizable no-frills Minecraft chat system
- Spigot plugin enabling two-way chat messaging between Minecraft and Discord
- macOS System-wide Audio Equalizer & Volume Mixer 🎧
- RWKV is an RNN with transformer-level LLM performance. It can be trained directly like a GPT (parallelizable), combining the best of RNNs and transformers: great performance, fast inference,…
- Collection of film emulation presets for the open-source RAW developer Darktable.
- The fastest way to develop full-stack web apps with React & Node.js.
- Lists and configuration for our DNS blocking service
- A Unified Library for Parameter-Efficient and Modular Transfer Learning
- 🦜🔗 Build context-aware reasoning applications
- adefossez / demucs (forked from facebookresearch/demucs): Code for the paper "Hybrid Spectrogram and Waveform Source Separation"
- A curated list of neural network pruning resources.
- [NeurIPS 2023] LLM-Pruner: On the Structural Pruning of Large Language Models. Supports LLaMA, Llama-2, BLOOM, Vicuna, Baichuan, etc.
- Tools for merging pretrained large language models.
- SiLLM simplifies training and running Large Language Models (LLMs) on Apple Silicon by leveraging the MLX framework.
- Make huge neural nets fit in memory
- OpenBA-V2: a 3B Large Language Model with a T5 architecture, built by pruning and continued pretraining from OpenBA-15B.
- Automated identification of redundant layer blocks for pruning in Large Language Models