
Underrated by people who are unfamiliar with machine learning, maybe.


I actually tend to agree. In the article, I didn't see a strong argument for what powerful capability, exactly, people are missing when it comes to embeddings. Those who work in ML probably already know these basics.

It is a nice read, though - it explains the basics of vector spaces, similarity, and how they're used in modern ML applications.
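For anyone skimming, here's roughly what "similarity" means in this context - a minimal sketch in Python with numpy, where the toy 4-dimensional vectors stand in for real embedding-model outputs:

    import numpy as np

    def cosine_similarity(a, b):
        # Cosine of the angle between two embedding vectors:
        # ~1.0 = pointing the same way (similar), ~0.0 = unrelated.
        return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

    # Toy 4-dim "embeddings"; real models output hundreds of dims.
    doc_a = np.array([0.9, 0.1, 0.0, 0.2])
    doc_b = np.array([0.8, 0.2, 0.1, 0.3])
    doc_c = np.array([0.0, 0.9, 0.8, 0.1])

    print(cosine_similarity(doc_a, doc_b))  # ~0.98: similar
    print(cosine_similarity(doc_a, doc_c))  # ~0.10: dissimilar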


> Hopefully it's clear from the domain name and intro that I'm suggesting technical writers are underrating how useful embeddings can be in our work. I know ML practitioners do not underrate them.

https://news.ycombinator.com/item?id=42014036

> I didn't see a strong argument for what powerful capability, exactly, people are missing when it comes to embeddings

I had to leave out specific applications as "an exercise for the reader" for various reasons. Long story short, embeddings provide a path to make progress on some of the fundamental problems of technical writing.


Thank you for the explanation - yes, I later encountered your answer and upvoted it.

> I had to leave out specific applications as "an exercise for the reader"

That's very unfortunate. It would be very interesting to hear some details :)


LLMs have nearly completely sucked the oxygen out of the room when it comes to machine learning or "AI".

I'm shocked at the number of startups you see trying to do RAG that basically have no idea what embeddings are or how they actually work.

The "R" in RAG stands for retrieval - as in the entire field of information retrieval. But let's ignore that and skip right to the "G" (generative)...

Garbage in, garbage out, people!
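To make that concrete, here's a toy of just the "R" step. The embed() function is a hypothetical stand-in (a real system would call an embedding model such as sentence-transformers or a hosted API; this fake one just returns deterministic random vectors so the script runs):

    import hashlib
    import numpy as np

    def embed(text):
        # HYPOTHETICAL stand-in for a real embedding model: returns a
        # deterministic random unit vector so the script is runnable.
        seed = int.from_bytes(hashlib.md5(text.encode()).digest()[:4], "big")
        v = np.random.default_rng(seed).normal(size=384)
        return v / np.linalg.norm(v)

    docs = ["how to reset your password",
            "billing and invoices FAQ",
            "API rate limits explained"]
    doc_vecs = np.stack([embed(d) for d in docs])

    query = "I forgot my password"
    scores = doc_vecs @ embed(query)    # cosine similarity (unit vectors)
    top = np.argsort(scores)[::-1][:2]  # indices of the top-2 documents
    context = [docs[i] for i in top]    # the retrieval ("R") step;
    # only this retrieved context gets handed to the generator (the "G")
    print(context)

If that retrieval step returns garbage, no amount of prompting fixes the generation step - which is the garbage-in, garbage-out point.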


Even by ML people from 25 years ago. It's a black box function that maps from a ~30k-dimensional space to a ~1k-dimensional space. It's a better function than things like PCA, but it does the same kind of thing.
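For reference, the PCA version of that mapping as a runnable toy - random counts stand in for real data, and the output is shrunk to 50 dims instead of ~1k just to keep it fast:

    import numpy as np
    from sklearn.decomposition import PCA

    # Stand-in data: 300 "documents" as ~30k-dim bag-of-words counts.
    rng = np.random.default_rng(0)
    X = rng.poisson(0.1, size=(300, 30_000)).astype(np.float32)

    # Linear map from the ~30k space down to a small dense space,
    # the classic analogue of what a learned embedding does.
    pca = PCA(n_components=50)
    Z = pca.fit_transform(X)
    print(Z.shape)  # (300, 50): each doc is now a short dense vector

The difference is that the embedding model's mapping is learned and nonlinear, which is why it captures semantic structure that a linear projection like PCA can't.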



