Train ALBERT for natural language processing with TensorFlow on Amazon SageMaker

At re:Invent 2019, AWS shared the fastest training times on the cloud for two popular machine learning (ML) models: BERT (natural language processing) and Mask-RCNN (object detection). To train BERT in 1 hour, we efficiently scaled out to 2,048 NVIDIA V100 GPUs by improving the underlying infrastructure, network, and ML framework. Today, we’re open-sourcing the optimized training code for […]
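The optimized code itself is linked from the full post; purely as an orientation sketch, the snippet below shows how a distributed TensorFlow training job is typically launched with the SageMaker Python SDK. The entry-point script, IAM role, S3 path, and instance settings are placeholder assumptions, not the configuration described in the announcement.

```python
# Minimal sketch of launching a distributed TensorFlow training job on SageMaker.
# All names, paths, and settings below are placeholders, not the post's actual setup.
from sagemaker.tensorflow import TensorFlow

estimator = TensorFlow(
    entry_point="train_albert.py",          # hypothetical training script
    role="arn:aws:iam::123456789012:role/SageMakerRole",  # placeholder IAM role
    instance_count=8,                        # number of GPU instances
    instance_type="ml.p3dn.24xlarge",        # 8x NVIDIA V100 GPUs per instance
    framework_version="2.1",
    py_version="py3",
    distribution={"mpi": {"enabled": True, "processes_per_host": 8}},  # Horovod via MPI
)

# Placeholder S3 location for the pretraining data.
estimator.fit("s3://my-bucket/pretraining-data/")
```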

Growing Anomalies at the Large Hadron Collider Raise Hopes

Recent measurements of particles called B mesons deviate from predictions. Alone, each oddity looks like a fluke, but their collective drift is more suggestive. [Image caption: Computer reconstruction of a collision event in the Large Hadron Collider beauty experiment; the collision produces a B meson, which subsequently decays into other particles that strike LHCb’s detectors.] Amid the […]

Top 10 Best FREE Artificial Intelligence Courses

Most Machine Learning, Deep Learning, Computer Vision, and NLP job positions, and in general almost every Artificial Intelligence (AI) position, require you to have at least a bachelor’s degree in Computer Science, Electrical Engineering, or a similar field. If your degree comes from one of the world’s best universities, then your chances might be […]

A highly efficient, real-time text to speech system deployed on CPUs

Modern text-to-speech (TTS) systems have come a long way in using neural networks to mimic the nuances of the human voice. To generate humanlike audio, one second of speech can require a TTS system to output as many as 24,000 samples, sometimes even more. The size and complexity of state-of-the-art models require massive computation, which […]
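For context, that sample count is just the output sampling rate multiplied by the audio duration; the quick sketch below assumes a 24 kHz vocoder output rate, which is what the figure quoted above implies.

```python
# Back-of-the-envelope arithmetic for neural vocoder output (assumed 24 kHz rate).
sample_rate_hz = 24_000      # samples generated per second of audio
duration_s = 1.0

samples = int(sample_rate_hz * duration_s)
print(samples)               # 24000 samples for one second of speech

# A 10-second utterance at the same rate:
print(int(sample_rate_hz * 10.0))   # 240000 samples
```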

The Hateful Memes AI Challenge

We’ve built and are now sharing a data set designed specifically to help AI researchers develop new systems to identify multimodal hate speech. This content combines different modalities, such as text and images, making it difficult for machines to understand. The Hateful Memes data set contains 10,000+ new multimodal examples created by Facebook AI. We […]

Ultimate Guide to Natural Language Processing Courses

Selecting an online course that matches your requirements is very frustrating if you have high standards. Most courses are not comprehensive, and much of the time spent on them is wasted. How would you feel if someone provided you with a critical path and told you exactly which modules, and in which order, will […]

Python 3.9 – The Shape of Things to Come

Python 3.9 is scheduled for release on 5 October 2020. There is still a long way to go until then; however, with the latest alpha, 3.9.0a6, released last week and the beta version just around the corner, we can already discuss the new features. In this article, we explore some features that we have found interesting and […]
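The excerpt cuts off before naming the features the article covers, so as a neutral reference the sketch below demonstrates three additions that did ship in Python 3.9: dict union operators (PEP 584), str.removeprefix()/removesuffix() (PEP 616), and built-in generic types (PEP 585). These may or may not be the ones the article highlights.

```python
# Requires Python 3.9+.

# PEP 584: dict union operators.
defaults = {"timeout": 30, "retries": 3}
overrides = {"retries": 5}
merged = defaults | overrides            # {'timeout': 30, 'retries': 5}
defaults |= overrides                    # in-place union

# PEP 616: remove a prefix or suffix without slicing arithmetic.
filename = "report_2020.csv"
print(filename.removesuffix(".csv"))     # 'report_2020'
print(filename.removeprefix("report_"))  # '2020.csv'

# PEP 585: built-in collections usable directly as generic types in annotations.
def totals(values: list[int]) -> dict[str, int]:
    return {"sum": sum(values), "count": len(values)}
```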

Word2Vec: A Comparison Between CBOW, SkipGram & SkipGramSI

Learn how different Word2Vec architectures behave in practice, so that you can make an informed decision about which architecture to use for the problem you are trying to solve. In this article, we will look at how the different neural network architectures for training a Word2Vec model compare in practice. The idea here is […]
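As a concrete point of reference (not code from the article), the sketch below selects the three architectures in Gensim. It assumes that "SkipGramSI" refers to Skip-Gram with subword information, i.e. the FastText-style variant; the toy corpus and hyperparameters are placeholders.

```python
# Minimal sketch of the three Word2Vec-style architectures, using the Gensim 4.x API.
from gensim.models import Word2Vec, FastText

corpus = [
    ["natural", "language", "processing", "with", "word", "embeddings"],
    ["skip", "gram", "predicts", "context", "words", "from", "a", "target"],
    ["cbow", "predicts", "a", "target", "word", "from", "its", "context"],
]

# CBOW: sg=0 (the default) predicts the center word from its surrounding context.
cbow = Word2Vec(corpus, vector_size=50, window=3, min_count=1, sg=0)

# Skip-Gram: sg=1 predicts the context words from the center word.
skipgram = Word2Vec(corpus, vector_size=50, window=3, min_count=1, sg=1)

# Skip-Gram with subword information (assumed meaning of "SkipGramSI"):
# FastText adds character n-grams, so it can embed words unseen during training.
skipgram_si = FastText(corpus, vector_size=50, window=3, min_count=1, sg=1)

print(cbow.wv.most_similar("context", topn=3))
```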

The Dark Secrets Of BERT

BERT stands for Bidirectional Encoder Representations from Transformers. This model is basically a multi-layer bidirectional Transformer encoder (Devlin, Chang, Lee, & Toutanova, 2019), and there are multiple excellent guides about how it works generally, including the Illustrated Transformer. What we focus on is one specific component of the Transformer architecture known as self-attention. In a nutshell, it is […]
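Since the excerpt singles out self-attention, here is a minimal NumPy sketch of the scaled dot-product self-attention computed by each attention head. It is a generic illustration with toy dimensions, not code from the article or the BERT paper.

```python
# Scaled dot-product self-attention for a single head, in NumPy.
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """X: (seq_len, d_model); Wq/Wk/Wv: (d_model, d_head)."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv             # project tokens to queries/keys/values
    scores = Q @ K.T / np.sqrt(K.shape[-1])      # similarity of every token pair
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ V                           # each token becomes a weighted mix of values

rng = np.random.default_rng(0)
seq_len, d_model, d_head = 4, 8, 8               # toy sizes
X = rng.normal(size=(seq_len, d_model))
out = self_attention(X,
                     rng.normal(size=(d_model, d_head)),
                     rng.normal(size=(d_model, d_head)),
                     rng.normal(size=(d_model, d_head)))
print(out.shape)  # (4, 8)
```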