Hacker News: santiago-pl's comments

Thanks for raising it! Since vLLM has an OpenAI-compatible API, this should work for now:

  docker run --rm -p 8080:8080 \
    -e OPENAI_API_KEY="some-vllm-key-if-needed" \
    -e OPENAI_BASE_URL="http://host.docker.internal:11434/v1" \
    ...
    enterpilot/gomodel
I'll add a more convenient way to configure it in the coming days.
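To make the env wiring above concrete, here is a small sketch of how a client request would be addressed once the gateway forwards to the configured upstream. This is illustrative only: the model name is a placeholder, and whether the image reads exactly these variable names is taken from the example above, not verified.

```python
import json
import os

# Mirror the env vars from the docker command above (defaults are the
# example values; the actual names the image reads are assumed).
base_url = os.environ.get("OPENAI_BASE_URL", "http://host.docker.internal:11434/v1")
api_key = os.environ.get("OPENAI_API_KEY", "some-vllm-key-if-needed")

# A standard OpenAI-style chat-completions request body.
payload = {
    "model": "my-vllm-model",  # placeholder: whatever model vLLM is serving
    "messages": [{"role": "user", "content": "Hello"}],
}

# OpenAI-compatible servers expose the chat endpoint under /chat/completions.
url = f"{base_url.rstrip('/')}/chat/completions"
headers = {"Authorization": f"Bearer {api_key}"}

print(url)
print(json.dumps(payload))
```

Any OpenAI SDK client pointed at the gateway's port would build the same request, which is why the base-URL override is all the configuration needed for now.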

1. Yes, we have an OpenAI-compatible API, and we develop GoModel with Postel’s law in mind: https://gomodel.enterpilot.io/docs/about/technical-philosoph... .

2. Regarding being open-source and the license, I've described our approach here transparently: https://gomodel.enterpilot.io/docs/about/license
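The Postel’s-law point in (1), "be liberal in what you accept, conservative in what you send," can be sketched for a gateway like this. To be clear, this is not GoModel's actual code; the field names are just the two token-limit spellings OpenAI-style clients commonly send.

```python
def normalize_request(body: dict) -> dict:
    """Accept either token-limit spelling from clients; forward only one.

    Liberal on input: tolerate both "max_tokens" (legacy) and
    "max_completion_tokens" (newer). Conservative on output: the
    upstream always sees a single canonical "max_tokens" field.
    """
    out = dict(body)
    if "max_completion_tokens" in out:
        out["max_tokens"] = out.pop("max_completion_tokens")
    return out

print(normalize_request({"model": "m", "max_completion_tokens": 128}))
```

A gateway that normalizes like this lets clients written against either API generation work unchanged, while keeping the upstream request shape predictable.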


You're not the first person to ask about this.

It looks like a useful feature to have, so I'll dig into the topic more broadly over the next few days and let you know here whether, and possibly when, we plan to add it.


This comment looks AI-generated.

However, if I understand correctly what you're asking for, it's already in the dashboard! Check the Usage page.


Agreed, and thank you! Please let us know if you'd like to give it a try, and whether you're missing any feature in GoModel.

First of all, GoModel doesn't have a separate private repository behind a paywall/license.

It's more lightweight and simpler: the Bifrost Docker image looks about 4x larger, at least for now.

IMO GoModel is more convenient for debugging and for seeing how your request flows through different layers of AI Gateways in the Audit Logs.


That would be valuable if there were a commitment to never have a non-open-source offering under GoModel. If so, you could document it in the repo.

I would love to keep it open source forever, but I can't promise that for now. I've written a whole doc page about it if you're curious: https://gomodel.enterpilot.io/docs/about/license

If your concern is someone selling GoModel as a service, you could add a license provision for that. Technically it'd no longer be open source, I think, but most people won't care.

I'll consider it for sure.

It reminds me of Anthropic's Super Bowl ad: “Can I get a six pack quickly?” It actually turned out to be true.


I like how the plots look!

In recent months, I’ve been making charts for the benchmarks just by talking to Claude Code or Codex: “Generate the charts for this and that.”

I could try pointing it at your project next time. It would be easier with some kind of easy-to-install skill for AI agents.

I think it’s an inevitable trend this year, something people call "building for agents." (I saw someone phrase it that way on X.)


It looks like Trivy was compromised at least five days ago. https://www.wiz.io/blog/trivy-compromised-teampcp-supply-cha...


It's always interesting to read how the government is introducing AI into its work.

"The programmer is responsible for code produced" - that is definitely worth putting into practice.

"Providing more context to AI models can dramatically improve their performance. (...)" — on the other hand, too much context might reduce their performance.

