[VLDB'22] Anomaly Detection using Transformers, self-conditioning and adversarial training.
Deep Xi: A deep learning approach to a priori SNR estimation implemented in TensorFlow 2/Keras. For speech enhancement and robust ASR.
PyTorch implementations of several attention mechanisms for deep-learning researchers.
A Capacitated Vehicle Routing Problem solver based on "Attention, Learn to Solve Routing Problems!" [Kool et al., 2019].
Exploring attention weights in transformer-based models with linguistic knowledge.
This repository contains various types of attention mechanisms, such as Bahdanau, soft, additive, and hierarchical attention, in PyTorch, TensorFlow, and Keras.
Multi^2OIE: Multilingual Open Information Extraction Based on Multi-Head Attention with BERT (Findings of ACL: EMNLP 2020)
Visualization for simple attention and Google's multi-head attention.
A faster PyTorch implementation of multi-head self-attention.
several types of attention modules written in PyTorch
Self-Supervised Vision Transformers for multiplexed imaging datasets
Sentence encoder and training code for Mean-Max AAE
Attention-based Induction Networks for Few-Shot Text Classification
This is the official repository of the original Point Transformer architecture.
Code and datasets for the paper "A deep learning framework for high-throughput mechanism-driven phenotype compound screening and its application to COVID-19 drug repurposing", published in Nature Machine Intelligence in 2021.
TensorFlow implementation of AlexNet with multi-headed Attention mechanism
EMNLP 2018: Multi-Head Attention with Disagreement Regularization; NAACL 2019: Information Aggregation for Multi-Head Attention with Routing-by-Agreement
Text matching using several deep models.
HydraViT is a PyTorch implementation of the HydraViT model, an adaptive multi-branch transformer for multi-label disease classification from chest X-ray images. The repository provides the necessary code to train and evaluate the HydraViT model on the NIH Chest X-ray dataset.
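Since every repository above centers on multi-head attention, a minimal NumPy sketch of the mechanism may help orient readers. This is an illustrative toy, not code from any listed repository; the function and weight names (`multi_head_self_attention`, `w_q`, `w_k`, `w_v`, `w_o`) are made up for this example.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_self_attention(x, w_q, w_k, w_v, w_o, num_heads):
    """x: (seq_len, d_model); w_*: (d_model, d_model) projection matrices."""
    seq_len, d_model = x.shape
    d_head = d_model // num_heads

    # Project the input, then split the feature dimension into heads:
    # (seq_len, d_model) -> (num_heads, seq_len, d_head)
    def project(w):
        return (x @ w).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)

    q, k, v = project(w_q), project(w_k), project(w_v)

    # Scaled dot-product attention, computed independently per head
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_head)   # (heads, seq, seq)
    out = softmax(scores) @ v                             # (heads, seq, d_head)

    # Concatenate heads back to (seq_len, d_model), then project
    out = out.transpose(1, 0, 2).reshape(seq_len, d_model)
    return out @ w_o

rng = np.random.default_rng(0)
d_model, seq_len, num_heads = 16, 5, 4
x = rng.standard_normal((seq_len, d_model))
weights = [rng.standard_normal((d_model, d_model)) / np.sqrt(d_model)
           for _ in range(4)]
y = multi_head_self_attention(x, *weights, num_heads=num_heads)
print(y.shape)  # (5, 16)
```

The heads share nothing except the input: each attends over its own `d_head`-dimensional slice, which is what lets different heads specialize, as several of the papers above (e.g. the disagreement-regularization work) exploit.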