
> Beware: summarisations might not be accurate due to the nature of the model

I'd like more information about this before installing it.

Why aren't there more examples? The one example on the page is too small to read.

What personal information does the plugin collect?

Not trying to be negative. The plugin might be useful. It's difficult to make an assessment without this information.



The plugin collects the user's Google OpenID and the number of characters used for summarization. No emails or other personal information are collected.

The summarization request gets proxied to the Hugging Face Inference API. According to https://huggingface.co/inference-api, they claim to protect the data and not share it with third parties.
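Roughly, the proxying boils down to a single POST against the Inference API. A minimal sketch in Python, assuming a stock BART checkpoint - the model name and token here are illustrative, not necessarily what the plugin's backend uses:

    import requests

    # Illustrative only: the plugin's backend may use a different model or endpoint.
    API_URL = "https://api-inference.huggingface.co/models/facebook/bart-large-cnn"
    HEADERS = {"Authorization": "Bearer hf_xxx"}  # placeholder token

    def summarize(text: str) -> str:
        resp = requests.post(API_URL, headers=HEADERS, json={"inputs": text})
        resp.raise_for_status()
        # The API returns a list of results, each with a "summary_text" field.
        return resp.json()[0]["summary_text"]

    print(summarize("Paste a long article here to get a short abstractive summary."))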

On examples - thank you for the comment. I'm thinking of recording a video, improving the screenshots, etc. Right now it's a very early launch, mainly to hear people's opinions.

I'm not sure whether the model can misrepresent facts. From the BART paper:

"Because BART has an autoregressive decoder, it can be directly fine tuned for sequence generation tasks such as abstractive question answering and summarization. In both of these tasks, information is copied from the input but manipulated, which is closely related to the denoising pre-training objective."

"Information is copied but manipulated" - idk if it can lead to the factual mistakes and disinformation. Knowledgable opinions are best welcomed!


Here's an example:

http://www.paulgraham.com/hwh.html

There are three ingredients in great work: natural ability, practice, and effort . To do the best work you need all three: you need great natural ability and to have practiced a lot and to be trying very hard . The most basic level of which is simply to feel you should be working without anyone telling you to .


Thanks!

Pretty good. I remember reading that article and finding it difficult to get through. He needs an editor.




