**Ask HN: Cheaper or similar setup like Asus ROG G16 for local LLM development?**
Hello everyone,
I'm a math graduate student in Germany, and I've recently become interested in developing local and/or web apps with LLMs. I have a 12-year-old MacBook Pro, so I'm thinking about buying something new.
I have searched the relevant keywords here, and the "universal suggestion" seems to be to use a laptop to access GPUs in the cloud, instead of running training and/or inference on the laptop itself.
Someone mentioned the [ASUS ROG G16](https: ...)