Show HN: Bring-your-own-key browser extension for summarizing HN posts with LLMs

github.com

74 points by ivanyu 7 days ago

Hi Hacker News,

I developed an open source browser extension for summarizing Hacker News articles with OpenAI and Anthropic LLMs. It currently supports Chrome [1] and Firefox [2] (desktop).

The extension adds summarize buttons to the HN front page and to article pages.

It is bring-your-own-key, i.e. there's no backend behind it and the extension itself is free: you insert your own API key and pay only your LLM provider for tokens.

[1] https://chromewebstore.google.com/detail/hacker-news-tldr/oo...

[2] https://addons.mozilla.org/ru/firefox/addon/hacker-news-tl-d...
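To illustrate the bring-your-own-key idea, here is a minimal sketch of how an extension like this can call the provider directly from the browser, with no backend in between. All names (function, model, prompt) are illustrative assumptions, not the extension's actual code:

```javascript
// Sketch of a BYOK flow: the user's API key stays local and goes straight
// to the provider. Request-building is separated out so it's easy to inspect.
const OPENAI_URL = "https://api.openai.com/v1/chat/completions";

function buildSummaryRequest(apiKey, articleText, model = "gpt-4o-mini") {
  return {
    url: OPENAI_URL,
    options: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        // The only place the key is used: sent directly to the provider.
        Authorization: `Bearer ${apiKey}`,
      },
      body: JSON.stringify({
        model,
        messages: [
          { role: "system", content: "Summarize the article in 3-5 sentences." },
          { role: "user", content: articleText },
        ],
      }),
    },
  };
}

async function summarize(apiKey, articleText) {
  const { url, options } = buildSummaryRequest(apiKey, articleText);
  const res = await fetch(url, options);
  if (!res.ok) throw new Error(`Provider error: ${res.status}`);
  const data = await res.json();
  return data.choices[0].message.content;
}
```

Since the key never leaves the user's machine except to reach the provider, the author has no infrastructure to run and nothing to bill for.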

androng 7 days ago

Related: to save on cost, Hacker News is already summarized and available in feed form, which I find better than the default front page, where you have to read the same headlines repeatedly because their order keeps changing. https://hackernews.betacat.io/ I also dislike how HN titles are often so short that they don't give enough information. E.g., headline: "Amazon Penterhorse". What is that?? That doesn't exist, but the point is I have to click through to find out, and it's annoying. And on some posts, when I click the link, the blog post is just way longer than my interest level, so it doesn't get the point across. These summaries are just the right length.

  • francasso 7 days ago

    I just read the summary for "Fermat's Last Theorem – how it's going" on that page and it completely missed the point of the article

  • sabbaticaldev 6 days ago

    this “related” spam on HN is so annoying. Make your own Show HN.

nickthegreek 7 days ago

Would be nice to be able to provide your own endpoint so it could be pointed at a local LLM.

  • ivanyu 7 days ago

    Thanks, good idea, this should be possible.

    • freedomben 7 days ago

      That would be a pretty killer feature IMHO. Ollama's API is pretty straightforward: https://github.com/ollama/ollama/blob/main/docs/api.md

      There is also (or at least used to be?) an OpenAI-compatible API layer for Ollama, so that may be an option as well, though my understanding is there are some downsides to using it.

      Note: This comment and the link are just meant as references/conveniences, not intended as a request for free labor. Thanks for opening up the code!

      • moffkalast 7 days ago

        Forget ollama, just changing the URL from openai to your local server is enough, llama.cpp has a compatible endpoint. Most people just don't bother giving the option since you get a CORS error if it doesn't have a valid cert.

        • freedomben 7 days ago

          Neat, I didn't know that! Thanks for the tip!
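The base-URL swap discussed above can be sketched like this: the same OpenAI-style chat-completions request works against Ollama's OpenAI-compatible layer or llama.cpp's `llama-server`, assuming a local server is already running. The URLs below are the commonly documented defaults, and the function names are hypothetical:

```javascript
// Sketch: one request shape, pointed at different hosts.
// Assumed defaults: Ollama on :11434, llama.cpp's llama-server on :8080.
const PROVIDERS = {
  openai: "https://api.openai.com/v1",
  ollama: "http://localhost:11434/v1",   // Ollama's OpenAI-compatible layer
  llamacpp: "http://localhost:8080/v1",  // llama.cpp llama-server default
};

function chatCompletionsUrl(provider) {
  const base = PROVIDERS[provider];
  if (!base) throw new Error(`Unknown provider: ${provider}`);
  return `${base}/chat/completions`;
}

async function localSummarize(provider, model, text) {
  const res = await fetch(chatCompletionsUrl(provider), {
    method: "POST",
    // Local servers typically don't check the API key, so extensions can
    // send a placeholder (or nothing) when the provider is local.
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model,
      messages: [{ role: "user", content: `Summarize: ${text}` }],
    }),
  });
  const data = await res.json();
  return data.choices[0].message.content;
}
```

From the extension's side, supporting local models is then mostly a matter of making the base URL a setting instead of a constant.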

mattigames 7 days ago

This is so weird, why does every user have to use their own LLM subscription/keys instead of just storing the summarizations of the HN posts on a static website somewhere? (e.g. summarize once for all users)

  • ivanyu 7 days ago

    When I was incubating the idea, I thought about different concepts:

    1. The current bring-your-own-key.

    2. A central summary storage, filled by me.

    3. A central summary storage, crowdsourced.

    4. A paid subscription, where I effectively run some LLM proxy.

    I wanted something low-overhead, just the right size for yet another weekend project that I could drop at any moment. Supporting infrastructure, having moderation headaches, let alone receiving payments, ruled out pretty much everything but the current approach.

  • t-writescode 7 days ago

    That costs money: hosting a server, running the summarization yourself, etc.

    While it is possible, such a service would require advertisements, a (probably monthly) fee, or a benevolent patron to pay for the costs.

    With this, the only necessary benevolence is the creator of the extension.

    • rgbrgb 7 days ago

      the server cost is basically nothing since it's static content. probably fits into free offerings from cloudflare.

      the summarization is where the cost is and it would be cool to have some crowdsourced contributor model there.

      smells like a cool crypto/ai crossover project in the making (or maybe drop the crypto and have upvote-driven "moderation").

  • mistrial9 7 days ago

    .. because being a threaded extension of some central hub connected to some other central hub is exactly the "vibe" ?

fudged71 7 days ago

I've been pretty happy lately with my setup.

Arc browser lets you hover over a link to show a card that summarizes the article.

With Claude Projects, I'm able to quickly build an Arc "Boost" user script for any site, so I have one that exports the HN homepage to JSON to import into an LLM, and another that does the same on comment pages. I also have a userscript that removes pagination so I can infinitely scroll and then export.

And I have a Claude Project specifically for identifying/categorizing comment threads by patterns of knowledge crystallization etc. It's been fascinating so far.
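As a rough sketch, an "export the HN front page to JSON" step like the one described could look like the following. It pulls title/link pairs out of the page HTML with a regex, assuming HN's current `titleline` markup; the actual Boost scripts aren't shown in the thread, so this is purely illustrative:

```javascript
// Illustrative userscript core: extract {url, title} pairs from HN front-page
// HTML. Assumes each story title sits in <span class="titleline"><a href=...>.
function extractStories(html) {
  const stories = [];
  const re = /<span class="titleline"><a href="([^"]+)"[^>]*>([^<]+)<\/a>/g;
  let m;
  while ((m = re.exec(html)) !== null) {
    stories.push({ url: m[1], title: m[2] });
  }
  return stories;
}

// In a userscript this would run against the live page, e.g.:
//   const json = JSON.stringify(extractStories(document.body.innerHTML));
```

A DOM-based version (`document.querySelectorAll(".titleline > a")`) would be more robust in the browser; the regex form is used here only so the extraction logic is self-contained.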

simonebrunozzi 7 days ago

Finally!

I tried to build something like this a few years back [0]. I thought it was a great idea, but LLMs weren't available yet, and I was busy with a hundred other things.

You can see an example of the summary there.

[0]: https://github.com/simonebrunozzi/MNMN

alaq 7 days ago

I'd love it if it could summarize the HN comments as well.

  • ivanyu 7 days ago

    Yeah, I should do comments. This is a popular feedback item.

ionwake 7 days ago

Brilliant, Love it. Excellent execution.

impure 7 days ago

I added something similar to my feed reader. It even summarizes comments. The problem is paywalls, sites that heavily use JavaScript, and other bot-protection measures (such as OpenAI's blog, which I found a little bit ironic). But I guess you might be able to get around bot protections and JS since it's a browser extension.

consumer451 7 days ago

This sounds possibly useful, but I just don't trust extensions unless I can see the code.

  • dgosling84 7 days ago

    Hmm, even though the code is open source, how do you know that the published extension is bundled from the same code as what's open sourced? I guess I'm trying to say you'll have to be okay with some level of trust here, unless you clone the open-source project and use it to load an unpublished extension yourself.

    • ivanyu 7 days ago

      That's understandable, I feel the same when I install extensions. In both browsers, you can install the extension from local disk instead of the browser stores. The release artifact is a ZIP file with plain JS inside, no bundling, minification, or preprocessing, so you can check it out. Both Chrome and Mozilla did some inspection over several business days, but I can't say exactly what they checked or how diligently.