Natural Language Processing
#CellStratAILab #disrupt4.0 #WeCreateAISuperstars Minutes from the Saturday, 7th Dec AI Lab meetup at BLR: Last Saturday we had fabulous presentations by AI Lab members in BLR. Recommender Systems: In our Hebbal AI Lab, the day started with an in-depth session on Recommender Systems by Gurumoorthy Loganathan. A recommender system is a subclass of information filtering […]
Transfer Learning is a popular approach in deep learning in which pre-trained models are used as the starting point for computer vision (CNN) and natural language processing (NLP) tasks. It is popular given the enormous resources required to train deep learning models or the large and challenging datasets on which deep […]
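For instance, a minimal transfer-learning sketch with PyTorch/torchvision could look like the following, assuming an ImageNet-pretrained ResNet-18 as the starting point and a hypothetical 10-class downstream task (the backbone choice, class count, and dummy batch are illustrative assumptions, not something the post prescribes):

```python
import torch
import torch.nn as nn
from torchvision import models

# Load a model pre-trained on ImageNet as the starting point (assumed backbone).
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Freeze the pre-trained backbone so only the new head is trained.
for param in model.parameters():
    param.requires_grad = False

# Replace the final classification layer for a hypothetical 10-class task.
num_classes = 10
model.fc = nn.Linear(model.fc.in_features, num_classes)

# Only the new layer's parameters are passed to the optimizer.
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# One illustrative training step on a dummy batch of 8 RGB images.
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, num_classes, (8,))
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
```

Freezing the backbone and training only the new head is one common recipe; unfreezing some or all layers with a smaller learning rate is another, depending on how much target-domain data is available.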
Models #pretrained on a domain- or application-specific corpus: #BioBERT (biomedical text), #SciBERT (scientific publications), #ClinicalBERT (clinical notes). Training on a domain-specific corpus has been shown to yield better performance when fine-tuning on downstream #NLP tasks such as #NER for those domains, compared to fine-tuning #BERT (which was trained on BooksCorpus and Wikipedia).
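As a rough sketch of what fine-tuning such a domain-specific checkpoint for #NER can look like with the Hugging Face transformers library (the checkpoint name, label count, example sentence, and placeholder labels below are all illustrative assumptions):

```python
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

# BioBERT weights assumed to be available on the Hugging Face Hub.
checkpoint = "dmis-lab/biobert-base-cased-v1.1"
num_labels = 3  # e.g. O / B-ENTITY / I-ENTITY for a toy NER label scheme

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForTokenClassification.from_pretrained(checkpoint, num_labels=num_labels)

# Tokenize one biomedical-style sentence and run a forward pass with dummy labels.
sentence = "Aspirin inhibits platelet aggregation."
inputs = tokenizer(sentence, return_tensors="pt")
labels = torch.zeros_like(inputs["input_ids"])  # placeholder token labels for illustration

outputs = model(**inputs, labels=labels)
outputs.loss.backward()          # gradients for one fine-tuning step
print(outputs.logits.shape)      # (1, sequence_length, num_labels)
```

The same pattern applies to SciBERT or ClinicalBERT by swapping the checkpoint name; only the token-classification head on top is newly initialised and learned from the downstream NER data.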
The #Transformer is a deep learning model introduced in 2017, used primarily in the field of natural language processing (#NLP). Like recurrent neural networks (#RNNs), #Transformers are designed to handle ordered sequences of data, such as natural language, for tasks such as machine translation and text summarization. (Source: Wikipedia)
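To make the encoder-decoder structure concrete, here is a toy sequence-to-sequence sketch using PyTorch's built-in nn.Transformer module; the model dimensions, vocabulary size, and random token batches are arbitrary assumptions, and positional encodings are omitted for brevity:

```python
import torch
import torch.nn as nn

vocab_size, d_model = 1000, 64
embed = nn.Embedding(vocab_size, d_model)
transformer = nn.Transformer(d_model=d_model, nhead=4,
                             num_encoder_layers=2, num_decoder_layers=2,
                             batch_first=True)
generator = nn.Linear(d_model, vocab_size)  # projects decoder states to token logits

# Source and target token sequences, e.g. for a translation-style task.
src = torch.randint(0, vocab_size, (2, 12))  # (batch, source_length)
tgt = torch.randint(0, vocab_size, (2, 9))   # (batch, target_length)

# Causal mask so each target position attends only to earlier positions.
tgt_mask = transformer.generate_square_subsequent_mask(tgt.size(1))

out = transformer(embed(src), embed(tgt), tgt_mask=tgt_mask)
logits = generator(out)
print(logits.shape)  # torch.Size([2, 9, 1000])
```

Unlike an #RNN, the Transformer processes all positions of a sequence in parallel and relies on attention (plus positional information) rather than recurrence to model order.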