Show HN: a desktop app to discover, download, and run offline LLMs (lmstudio.ai)
10 points by ybu on May 24, 2023 | hide | past | favorite | 7 comments


Wow, this makes running LLMs locally significantly more approachable. Very cool!


Agreed!


This is great! Is it CPU-only for now, or can it leverage the GPU for faster inference?


Focus is CPU for now - but it'd be very useful to have GPU support. As a baseline, this can support anything that llama.cpp supports. Relevant link: https://github.com/ggerganov/llama.cpp#blas-build
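For anyone curious what the linked BLAS build looks like, here is a rough sketch based on the llama.cpp README of that era (the exact make flags may have changed since; treat the flag names as examples, not a guarantee):

```shell
# Clone and build llama.cpp with a BLAS backend for faster prompt processing.
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp

# CPU with OpenBLAS (requires libopenblas-dev or equivalent installed):
make LLAMA_OPENBLAS=1

# Or, on an NVIDIA GPU with the CUDA toolkit installed, use cuBLAS:
# make LLAMA_CUBLAS=1
```

With the cuBLAS build, the main binary could then offload work to the GPU via its layer-offload option; which option and how many layers fit depends on the model and VRAM.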


Looks very useful! Can't wait to try the Windows version. Thanks!


Follow up: Windows release is out.


hopefully soon!



