What Lies in Word Embeddings
Word embeddings have become popular in recent years. They come in all shapes and sizes and are often part of a recommended approach for NLP problems. In this talk I would like to demonstrate what merits you might expect, but also where they fall short. We may need to tackle some hype in order to understand what "lies" in word embeddings.
Vincent D. Warmerdam
My name is Vincent, ask me anything. I have been evangelizing data and open source for the last 5 years. You might know me from tech talks where I attempt to defend common sense over hype in data science.
I’m a self-taught developer and data scientist currently living in the Netherlands. In the past I’ve been:
- on the front page of reddit
- a digital nomad with an amazing postcard
- a preferred RStudio training partner
- co-founder and co-chair of PyData Amsterdam
- co-founder of the satRdays Amsterdam conference
- a data/open source evangelist with a track record of speaking
- co-creator of some open source packages: evol, scikit-lego, whatlies
- a youtuber (sort of) for the spaCy project
- involved with many data projects, from Bayesian recommenders to deep learning to data mining
Currently I work as a Research Advocate at Rasa, where I collaborate with the research team to better explain and understand conversational systems.