transformers
Recently, I presented a session on End-to-End Object Detection with Transformers at CellStrat AI Lab. Introduction The goal of object detection in computer vision is to identify and locate objects in an image or video. Object detection methods can be divided into two categories: two-stage detectors and single-stage detectors. In two-stage […]
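The two-category split can be sketched in a few lines. This is a purely illustrative toy (the region proposals and the classification rule are dummies, not a real detector): a two-stage detector first proposes candidate regions and then classifies each, while a single-stage detector predicts boxes and classes in one pass.

```python
# Illustrative sketch only — contrasts the two-stage pipeline
# (propose regions, then classify each) with the single-stage
# pipeline (predict boxes and classes directly). Boxes are (x, y, w, h).

def propose_regions(image):
    # Stage 1 of a two-stage detector (cf. Faster R-CNN's region
    # proposal network): generate candidate boxes (dummy boxes here).
    h, w = len(image), len(image[0])
    return [(0, 0, w // 2, h // 2), (w // 2, h // 2, w // 2, h // 2)]

def classify_region(image, box):
    # Stage 2: classify each proposed region (dummy rule for illustration).
    x, y, w, h = box
    return "object" if w * h > 4 else "background"

def two_stage_detect(image):
    return [(box, classify_region(image, box)) for box in propose_regions(image)]

def single_stage_detect(image):
    # Single-stage detectors (e.g. YOLO, SSD) skip the proposal step
    # and emit boxes plus class scores in a single pass (dummy output here).
    h, w = len(image), len(image[0])
    return [((0, 0, w, h), "object")]

image = [[0] * 8 for _ in range(8)]  # dummy 8x8 "image"
print(two_stage_detect(image))
print(single_stage_detect(image))
```

The structural point is the extra proposal step: two-stage methods trade speed for accuracy by examining candidate regions individually, which is why single-stage methods are typically faster.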
#CellStratAILab #disrupt4.0 #WeCreateAISuperstars #WhereLearningNeverStops Last Saturday, AI Researcher Indrajit Singh presented a marvellous workshop on “Bidirectional Encoder Representations from Transformers” – “BERT” for short. We are in the era of pre-trained models in AI. The figure below shows the many pre-trained models which have taken root. BERT is one of the core models […]
Amazon SageMaker is a fully managed machine learning service. It enables developers to quickly build, train, and deploy machine learning models in the cloud. It provides a built-in Jupyter authoring notebook session for easy access to data sources for exploration and analysis, so one doesn’t have to manage servers. It also provides common […]
#CellStratAILab #disrupt4.0 #WeCreateAISuperstars Last Saturday, we had amazing presentations by some of our AI Lab members. First, Shreyas S K presented an extensive and superb presentation on Anti-Money Laundering with the help of Machine Learning. Money laundering refers to exploiting the banking system to perpetrate fraudulent transactions for unauthorized gains. Machine Learning algorithms like Logistic […]
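As a rough sketch of the logistic-regression approach mentioned above, the toy below fits a classifier on made-up transaction features (the two features, the cluster means, and the labels are all invented for illustration — this is not the presented model):

```python
import numpy as np

# Toy AML classifier: logistic regression via batch gradient descent on
# synthetic data. Features: [normalized amount, normalized frequency];
# label 1 = suspicious. All numbers here are illustrative.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.2, 0.1, (50, 2)),    # normal activity
               rng.normal(0.8, 0.1, (50, 2))])   # suspicious activity
y = np.concatenate([np.zeros(50), np.ones(50)])

w, b, lr = np.zeros(2), 0.0, 0.5
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))       # sigmoid probabilities
    w -= lr * (X.T @ (p - y)) / len(y)           # gradient step on weights
    b -= lr * np.mean(p - y)                     # gradient step on bias

preds = (1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5).astype(int)
accuracy = np.mean(preds == y)
print(f"training accuracy: {accuracy:.2f}")
```

On well-separated clusters like these the model separates the two classes cleanly; real AML data is far noisier and heavily imbalanced, which is what makes the problem hard in practice.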
#CellStratAILab #disrupt4.0 #WeCreateAISuperstars CellStrat AI Lab is engaged in incredible AI innovations and product development activity. The sophistication of our AI Lab members’ presentations continues to rise week after week. Last Saturday, the AI Lab started with an interesting session on Image Descriptors, Feature Descriptors & Feature Vectors by Sonal Kukreja. In computer vision, image descriptors are descriptions of the visual […]
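A minimal example of the idea: the simplest image descriptor is an intensity histogram turned into a fixed-length feature vector that can be compared numerically. (The histogram descriptor here is a stand-in for illustration; the session likely covered richer descriptors such as SIFT, ORB, or HOG.)

```python
import numpy as np

def histogram_descriptor(image, bins=8):
    # Summarize an image's visual content as a fixed-length vector:
    # a normalized grayscale intensity histogram.
    hist, _ = np.histogram(image, bins=bins, range=(0, 256))
    return hist / hist.sum()  # normalize so images of any size compare

rng = np.random.default_rng(1)
dark = rng.integers(0, 64, (16, 16))       # mostly dark toy image
bright = rng.integers(192, 256, (16, 16))  # mostly bright toy image

d1, d2 = histogram_descriptor(dark), histogram_descriptor(bright)
# Euclidean distance between feature vectors: similar images score low,
# dissimilar images score high.
print(np.linalg.norm(d1 - d2))
```

Once images are reduced to vectors like this, standard distance metrics and classifiers apply directly, which is the whole point of feature descriptors.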
A new transfer learning technique called BERT (short for Bidirectional Encoder Representations from Transformers) is making big waves in the NLP space. BERT excels at handling “context-heavy” language problems. For example… Bats are found in dark places. versus Cricket bats are going high-tech these days. Earlier, context-free models (like word2vec or GloVe) generated a […]
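The contrast can be shown with a toy: a context-free lookup (word2vec/GloVe style) returns one fixed vector per word, so “bats” gets the same vector in both sentences, whereas a contextual model like BERT produces different vectors depending on the sentence. The `contextual` function below is a crude stand-in (it just averages in neighboring word vectors), not real BERT, and the tiny vectors are invented for illustration.

```python
# Toy 2-d "embeddings" — invented values, for illustration only.
STATIC = {"bats": [0.9, 0.1], "dark": [0.8, 0.2], "cricket": [0.1, 0.9]}

def context_free(word, sentence):
    # word2vec/GloVe style: one vector per word, sentence is ignored.
    return STATIC[word]

def contextual(word, sentence):
    # Stand-in for a contextual model: blend the word's vector with the
    # average of its neighbors', so the output shifts with the context.
    vecs = [STATIC[w] for w in sentence if w in STATIC]
    return [(b + sum(v[i] for v in vecs) / len(vecs)) / 2
            for i, b in enumerate(STATIC[word])]

s1 = ["bats", "are", "found", "in", "dark", "places"]
s2 = ["cricket", "bats", "are", "going", "high-tech"]

print(context_free("bats", s1) == context_free("bats", s2))  # True
print(contextual("bats", s1) == contextual("bats", s2))      # False
```

This is exactly the failure mode of context-free models: one vector must serve every sense of a word, while BERT’s bidirectional encoder reads the whole sentence and disambiguates.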