Natural Language Processing
Survey forms, invoices, bills, memos, tenders: businesses face myriad such documents on a daily basis. These are full of numbers, characters, images, signatures, etc., in almost all languages worldwide. It is hard to imagine our world without language. Think about how much text and voice data we encounter every day. What about determining meaning from this […]
#CellStratAILab #disrupt4.0 #WeCreateAISuperstars #WhereLearningNeverStops Last Saturday, AI Researcher Indrajit Singh presented a marvellous workshop on “Bidirectional Encoder Representations from Transformers”, “BERT” for short. We are in the era of pre-trained models in AI. The figure below shows the many pre-trained models which have taken root. BERT is one of the core models […]
#CellStratAILab #disrupt4.0 #WeCreateAISuperstars #WhereLearningNeverStops Last Saturday, our AI Lab Researcher Indrajit Singh presented an exhaustive webinar on Transformers, which are used in NLP. Introduction To combine the advantages of both CNNs and RNNs, [Vaswani et al., 2017] designed a novel architecture based on the attention mechanism. This architecture, called the Transformer, achieves parallelization by […]
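The attention mechanism at the heart of the Transformer can be sketched in a few lines of numpy. This is a minimal illustration of scaled dot-product attention, Attention(Q, K, V) = softmax(QKᵀ/√d_k)V; the array sizes below are made-up assumptions, not anything from the webinar.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # (n_q, n_k) similarity scores
    weights = softmax(scores, axis=-1)   # each query attends over all keys
    return weights @ V, weights

rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))   # 4 query positions, d_k = 8
K = rng.normal(size=(6, 8))   # 6 key positions
V = rng.normal(size=(6, 8))
out, w = scaled_dot_product_attention(Q, K, V)
```

Because every position attends to every other position in one batch of matrix multiplies, there is no sequential recurrence to unroll, which is how the Transformer achieves the parallelization mentioned above.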
#CellStratAILab #disrupt4.0 #WeCreateAISuperstars Last Saturday, AI Researcher Indrajit Singh presented a superb workshop on Dependency Parsing used in NLP. The topics covered in this workshop included: understanding Dependency Parsing; Syntactic Structure: Constituency and Dependency; Dependency Grammar and Treebanks; and Transition-based Dependency Parsing. Dependency Parsing involves detecting which words depend on which other words. Dependencies are […]
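Transition-based dependency parsing, the last topic above, can be sketched with a stack, a buffer, and three actions (the arc-standard system). The sentence, the gold action sequence, and the `parse` helper below are all illustrative assumptions, a toy oracle-driven run rather than a trained parser.

```python
# Arc-standard transition-based dependency parsing: a stack, a buffer,
# and three actions (SHIFT, LEFT-ARC, RIGHT-ARC) build a dependency tree.
def parse(words, actions):
    stack = [0]                                # 0 is the artificial ROOT
    buffer = list(range(1, len(words) + 1))    # token positions, left to right
    arcs = []                                  # (head, dependent) pairs
    for act in actions:
        if act == "SHIFT":                     # move next token onto the stack
            stack.append(buffer.pop(0))
        elif act == "LEFT-ARC":                # second-from-top depends on top
            dep = stack.pop(-2)
            arcs.append((stack[-1], dep))
        elif act == "RIGHT-ARC":               # top depends on second-from-top
            dep = stack.pop()
            arcs.append((stack[-1], dep))
    return arcs

# Toy sentence "She ate fish" (positions 1, 2, 3) with a hand-written oracle.
words = ["She", "ate", "fish"]
actions = ["SHIFT", "SHIFT", "LEFT-ARC", "SHIFT", "RIGHT-ARC", "RIGHT-ARC"]
arcs = parse(words, actions)
# arcs: (2, 1) ate->She, (2, 3) ate->fish, (0, 2) ROOT->ate
```

In a real transition-based parser, a classifier predicts the next action from the current stack/buffer configuration instead of reading it from a fixed list.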
#CellStratAILab #disrupt4.0 #WeCreateAISuperstars #AlwaysUpskilling #CellStratPrime Minutes from the Saturday 22nd Feb 2020 AI Lab meetup at BLR: Last Saturday our AI Researchers presented some amazing algorithms in the AI Lab. Graph Neural Networks: First, Pushparaj M. presented an excellent seminar on Graph Neural Networks (GNNs). A Graph is a set of nodes (vertices) connected […]
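The core GNN idea, each node updating its features by aggregating its neighbours', fits in one numpy function. This is a minimal GCN-style layer sketch; the path graph and feature dimensions are made-up assumptions for illustration, not material from the seminar.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One graph-convolution layer: aggregate neighbour features, then transform.
    A: (n, n) adjacency, H: (n, d_in) node features, W: (d_in, d_out) weights."""
    A_hat = A + np.eye(A.shape[0])              # add self-loops
    deg = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(deg))    # symmetric degree normalization
    A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt
    return np.maximum(A_norm @ H @ W, 0.0)      # ReLU nonlinearity

# A 4-node path graph: 0 - 1 - 2 - 3
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
rng = np.random.default_rng(1)
H = rng.normal(size=(4, 5))     # 5 input features per node
W = rng.normal(size=(5, 3))     # project to 3 output features
H_out = gcn_layer(A, H, W)      # new (4, 3) node representations
```

Stacking several such layers lets information propagate across multi-hop neighbourhoods of the graph.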
#CellStratAILab #disrupt4.0 #WeCreateAISuperstars Last Saturday (11th Jan 2020), CellStrat AI Lab Team Lead Indrajit Singh presented a superb workshop on “Text-to-Speech (TTS) with Tacotron and Tacotron 2”. Here is a summary of the Tacotron algorithm: For a fan of the Marvel Cinematic Universe, the voice of J.A.R.V.I.S., the AI managing Tony Stark’s […]
Transfer Learning is a popular approach in deep learning in which pre-trained models are used as the starting point for computer vision (CNN) and natural language processing (NLP) tasks. It is attractive given the enormous resources required to train deep learning models, or the large and challenging datasets on which deep […]
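The transfer-learning recipe, keep the pre-trained backbone frozen and train only a small task head, can be sketched without any framework. Below, a random projection stands in for a hypothetical pre-trained feature extractor (an assumption purely for illustration), and only a logistic-regression head on top of it is trained.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a pretrained backbone: a hypothetical frozen feature extractor.
W_pretrained = rng.normal(size=(10, 4))

# Toy downstream task: binary labels driven by the first two input features.
X = rng.normal(size=(200, 10))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

F = np.tanh(X @ W_pretrained)   # frozen features: the backbone is never updated

def loss(w, b):
    p = 1.0 / (1.0 + np.exp(-(F @ w + b)))
    return -np.mean(y * np.log(p + 1e-9) + (1 - y) * np.log(1 - p + 1e-9))

# Train only the small task head (a logistic regression) on top of the features.
w, b = np.zeros(4), 0.0
initial = loss(w, b)
for _ in range(500):                       # plain gradient descent on the head
    p = 1.0 / (1.0 + np.exp(-(F @ w + b)))
    w -= 0.1 * F.T @ (p - y) / len(y)
    b -= 0.1 * np.mean(p - y)
final = loss(w, b)                         # lower than `initial` after training
```

Because only the small head is optimized, far less data and compute are needed than training the whole model from scratch, which is exactly the appeal described above.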
Models #pretrained on a domain- or application-specific corpus: #BioBERT (biomedical text), #SciBERT (scientific publications), #ClinicalBERT (clinical notes). Training on a domain-specific corpus has been shown to yield better performance when fine-tuning on downstream #NLP tasks like #NER etc. for those domains, in comparison to fine-tuning #BERT (which was trained on BooksCorpus and Wikipedia)
A new transfer learning technique called BERT (short for Bidirectional Encoder Representations from Transformers) is making big waves in the NLP space. BERT excels at handling “context-heavy” language problems. For example: Bats are found in dark places. versus Cricket bats are going high-tech these days. Earlier, context-free models (like word2vec or GloVe) generated a […]
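The context-free vs. contextual distinction can be made concrete with a toy numpy sketch. The vocabulary vectors and the attention-style mixing function below are made-up assumptions (a crude stand-in for BERT, not BERT itself): a static table returns the same vector for “bats” in both sentences, while the contextual function produces different vectors depending on the neighbours.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab = ["bats", "dark", "caves", "cricket", "high-tech"]
E = {w: rng.normal(size=4) for w in vocab}   # made-up static embedding table

def static_vec(word, sentence):
    # Context-free (word2vec/GloVe style): the sentence is ignored entirely.
    return E[word]

def contextual_vec(word, sentence):
    # Crude stand-in for a contextual model: an attention-weighted mix of
    # the word's vector with its sentence neighbours.
    q = E[word]
    keys = np.stack([E[w] for w in sentence])
    weights = np.exp(keys @ q / 2.0)
    weights /= weights.sum()
    return weights @ keys

s1 = ["bats", "dark", "caves"]          # animal sense
s2 = ["cricket", "bats", "high-tech"]   # sports-equipment sense

same = np.allclose(static_vec("bats", s1), static_vec("bats", s2))          # True
diff = np.allclose(contextual_vec("bats", s1), contextual_vec("bats", s2))  # False
```

A context-free model assigns “bats” one vector regardless of sense, whereas a contextual model like BERT gives it a different representation in each sentence, which is why BERT handles context-heavy problems so well.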