
Theory

  • An attention-only model without RNNs (LSTM/GRU) is computationally more attractive (it processes input in parallel rather than sequentially) and even performs better than RNNs (it can retain information across spans of 100+ words).
  • BERT uses the idea of representing…
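The core of the attention-only approach mentioned above is scaled dot-product attention: every query attends to all keys at once, so the whole sequence can be processed in parallel and distant tokens are reachable in a single step. A minimal sketch in plain Python (the vector sizes and inputs here are illustrative, not taken from any specific model):

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(queries, keys, values):
    """Scaled dot-product attention over lists of vectors.

    Each query is compared against every key, the scores are
    normalized with softmax, and the values are averaged with
    those weights -- no recurrence, no sequential dependency.
    """
    d = len(keys[0])  # key dimension, used for the 1/sqrt(d) scaling
    outputs = []
    for q in queries:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in keys]
        weights = softmax(scores)
        out = [sum(w * v[j] for w, v in zip(weights, values))
               for j in range(len(values[0]))]
        outputs.append(out)
    return outputs

# A query aligned with the first key pulls its output toward the first value.
out = attention([[1.0, 0.0]], [[1.0, 0.0], [0.0, 1.0]], [[1.0], [0.0]])
```

Because every query-key pair is independent, the double loop maps directly onto a batched matrix multiply on a GPU, which is what makes the approach so much faster to train than an RNN.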


pip install git+https://github.com/huggingface/transformers@master



Theory



Theory

  • In some alphabets (Arabic, for example, especially in its cursive form), letters are much harder to locate and recognize.
  • There are many different fonts and styles, and some of them make characters look too similar (like the letter I and lowercase l, or the digit 0 and the letter O).
  • Handwritten text comes in all shapes and sizes, and even the most advanced tools like Google…



Theory


Seq2Seq chatbot connected to Telegram bot

Some theory


Some theory

Recurrent neural network with Gated Recurrent Unit
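A GRU cell keeps a hidden state and mixes in new input through two learned gates: an update gate deciding how much of the old state to keep, and a reset gate deciding how much of it to consult when forming the candidate state. A minimal single-unit sketch in plain Python (the weight values below are arbitrary placeholders, not trained parameters):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def gru_step(x, h_prev, p):
    """One GRU step for a scalar input and scalar hidden state.

    p is a dict of the cell's weights: wz/uz/bz for the update gate,
    wr/ur/br for the reset gate, wh/uh for the candidate state.
    """
    z = sigmoid(p["wz"] * x + p["uz"] * h_prev + p["bz"])      # update gate
    r = sigmoid(p["wr"] * x + p["ur"] * h_prev + p["br"])      # reset gate
    h_cand = math.tanh(p["wh"] * x + p["uh"] * (r * h_prev))   # candidate state
    # New state interpolates between the old state and the candidate.
    return (1.0 - z) * h_prev + z * h_cand

# Run a short sequence through the cell with placeholder weights.
params = dict(wz=0.5, uz=0.3, bz=0.0, wr=0.5, ur=0.3, br=0.0, wh=1.0, uh=1.0)
h = 0.0
for x in [1.0, -0.5, 0.2]:
    h = gru_step(x, h, params)
```

Since tanh bounds the candidate in (-1, 1) and the update gate only interpolates, the hidden state stays bounded, which is part of why GRUs train more stably than a plain RNN.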


Some faces, generated by this DCGAN

Some theory



Nikita Schneider

Team Lead & Software Architect
