Ask HN: Cheaper or similar setup like Asus ROG G16 for local LLM development?
Hello everyone,

I’m a math graduate student in Germany, and I’ve recently become interested in developing local and/or web apps with LLMs. I have a 12-year-old MacBook Pro, so I’m thinking about buying something new.

I have searched the relevant keywords here, and the “universal suggestion” seems to be to use a laptop to access GPUs in the cloud, instead of running training and/or inference on the laptop itself.

Someone mentioned [ASUS ROG G16](https: …
