A Gluon implementation of Residual Attention Network. Best accuracy on CIFAR-10: 97.78%.
A PyTorch implementation of Residual Attention Network. This code is based on two projects from
Re-implementation of BiDAF (Bidirectional Attention Flow for Machine Comprehension, Minjoon Seo et al., ICLR 2017) in PyTorch.
Bi-directional Attention Flow (BiDAF) network is a multi-stage hierarchical process that represents context at different levels of granularity and uses a bi-directional attention flow mechanism to achieve a query-aware context representation without early summarization.
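The attention flow layer described above can be sketched as follows. This is an illustrative PyTorch re-derivation from the BiDAF paper's formulas, not code from the listed repository; the class name `BiDAFAttention` and the trainable similarity weight `w` are our own naming choices.

```python
import torch
import torch.nn as nn

class BiDAFAttention(nn.Module):
    """Sketch of BiDAF's attention flow layer (Seo et al., ICLR 2017).

    Given context H (batch, T, d) and query U (batch, J, d), it builds a
    similarity matrix S, derives context-to-query (C2Q) and
    query-to-context (Q2C) attended vectors, and concatenates them into a
    query-aware context representation G of shape (batch, T, 4d) --
    no early summarization of the context into a single vector.
    """

    def __init__(self, d):
        super().__init__()
        # alpha(h, u) = w^T [h; u; h*u] -- trainable similarity function
        self.w = nn.Linear(3 * d, 1, bias=False)

    def forward(self, H, U):
        B, T, d = H.shape
        J = U.size(1)
        # Score every (context word t, query word j) pair: S[t, j]
        h = H.unsqueeze(2).expand(B, T, J, d)
        u = U.unsqueeze(1).expand(B, T, J, d)
        S = self.w(torch.cat([h, u, h * u], dim=-1)).squeeze(-1)  # (B, T, J)

        # C2Q: for each context word, attend over the query words
        a = torch.softmax(S, dim=2)                    # (B, T, J)
        U_tilde = torch.bmm(a, U)                      # (B, T, d)

        # Q2C: attend over context words via max similarity to any query word
        b = torch.softmax(S.max(dim=2).values, dim=1)  # (B, T)
        h_tilde = torch.bmm(b.unsqueeze(1), H)         # (B, 1, d)
        H_tilde = h_tilde.expand(B, T, d)

        # Query-aware context representation G
        return torch.cat([H, U_tilde, H * U_tilde, H * H_tilde], dim=-1)
```

The 4d-wide output G then feeds the model's subsequent modeling layer, so attention is computed at every context position rather than summarizing the query-context pair early.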
Theano Implementation of DMN+ (Improved Dynamic Memory Networks) from the paper by Xiong, Merity, & Socher at MetaMind, http://arxiv.org/abs/1603.01417 (Dynamic Memory Networks for Visual and Textual Question Answering)
Key-Value Memory Networks for Directly Reading Documents, Alexander Miller, Adam Fisch, Jesse Dodge, Amir-Hossein Karimi, Antoine Bordes, Jason Weston https://arxiv.org/abs/1606.03126
Accompanying code for our COLING 2018 paper "Modeling Semantics with Gated Graph Neural Networks for Knowledge Base Question Answering"
Question Answering as Global Reasoning over Semantic Abstractions (AAAI-18)
This project was developed at the Laboratorio de Computación Reconfigurable (LCR) - Universidad Tecnológica Nacional, Mendoza, Argentina. It makes it possible to simulate and synthesize a hardware architecture for descriptor extraction based on the FREAK (Fast Retina Keypoint) algorithm. The architecture is described in VHDL, using ModelSim and the Xilinx ISE tools.
Using Verilog to implement the SIFT algorithm on an FPGA for small-scale robotics applications.
LBP Library - A Collection of LBP algorithms for background subtraction in videos.
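As a minimal sketch of the texture operator underlying such background subtractors: the basic 8-neighbour LBP code can be computed as below. This is a NumPy illustration under our own naming (`lbp_8bit`), not code from the LBP Library itself, which instead provides optimized C++ variants of these algorithms.

```python
import numpy as np

def lbp_8bit(gray):
    """Basic 8-neighbour Local Binary Pattern operator (illustrative).

    Each interior pixel is encoded as an 8-bit code: every neighbour
    brighter than or equal to the centre sets one bit. LBP-based
    background subtraction compares such texture codes (or local
    histograms of them) across frames instead of raw intensities,
    which makes it robust to gradual illumination changes.
    """
    gray = np.asarray(gray, dtype=np.int32)
    c = gray[1:-1, 1:-1]  # centre pixels (image borders are skipped)
    # 8 neighbours, clockwise from top-left, each with its own bit weight
    offsets = [(0, 0), (0, 1), (0, 2), (1, 2),
               (2, 2), (2, 1), (2, 0), (1, 0)]
    code = np.zeros_like(c)
    for bit, (dy, dx) in enumerate(offsets):
        n = gray[dy:dy + c.shape[0], dx:dx + c.shape[1]]
        code |= (n >= c).astype(np.int32) << bit
    return code  # values in [0, 255], shape (H-2, W-2)
```

On a perfectly flat (texture-free) region every neighbour equals the centre, so the code saturates at 255; any change in local texture between frames flips bits in the code, which is the signal the subtraction algorithms threshold on.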