HP has laptop subscriptions now
-
found this on a linus tech tips video https://www.youtube.com/watch?v=o4e-Kt02rfc
PC Subscription with Monthly Payment Plans | HP Laptop Subscription
Stay up to date with the latest tech with HP Laptop Subscription for access to a new PC, accessories, and trade up option after 1 year in monthly payments.
(hplaptopsubscription.hp.com)
-
7. HP AI Companion requires 32GB RAM for local processing.
WHAT THE FUCK
-
What is surprising about this? LLMs are giant memory consumers.
-
yea, i'm surprised, 32GB is goddamn ridiculous for anything, let alone for a shitty hp branded autocorrect
fuck AI, fuck HP, and fuck "laptop subscription"
the saddest thing is, people will sign up, if for no other reason than they have no other option
-
32GB is actually considered the bare minimum for most of the common locally run LLM models. Most folks don't run an LLM locally at all: they use a cloud service, so they don't need a huge pile of RAM on their own machine. However, privacy-focused or heavy users with cost concerns might choose to run an LLM locally so they're not paying per token. Running locally is comparable to renting a car when you need it vs buying one outright: if you only need a car once a year, renting is clearly the better choice; if you're driving to work every day, then buying the car yourself is clearly the better deal overall.
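To put rough numbers on "bare minimum": model weights dominate the footprint, at roughly parameters × bytes-per-parameter, plus some overhead for the KV cache, activations, and everything else on the machine. The model sizes, quantization levels, and overhead figure below are my own illustrative assumptions, not anything HP has published:

```python
# Back-of-the-envelope RAM estimate for running an LLM locally.
# Sizes and quantization levels are illustrative assumptions.

def model_ram_gb(params_billion: float, bytes_per_param: float,
                 overhead_gb: float = 2.0) -> float:
    """Weights = params * bytes/param; overhead covers the KV cache,
    activations, and the OS/apps sharing the machine (assumed 2 GB)."""
    weights_gb = params_billion * bytes_per_param
    return weights_gb + overhead_gb

# A 13B model in fp16 (2 bytes/param) vs 4-bit quantized (0.5 bytes/param):
fp16 = model_ram_gb(13, 2.0)   # 28.0 GB -- right at the edge of 32GB
q4 = model_ram_gb(13, 0.5)     # 8.5 GB -- comfortable
print(f"13B fp16: ~{fp16:.0f} GB, 13B 4-bit: ~{q4:.1f} GB")

# A 70B model even at 4-bit blows past a 32GB laptop:
print(f"70B 4-bit: ~{model_ram_gb(70, 0.5):.0f} GB")  # 37 GB
```

So 32GB is less "ridiculous" than it sounds: an unquantized mid-size model alone can eat most of it.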
You are perfectly fine not liking AI, but you're also out-of-touch if you think 32GB is too big for anything. Lots of other use cases need 32GB or more and have nothing to do with AI.
I agree with your frustration with subscription laptops. I hope people don't use it.
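The rent-vs-buy analogy can also be put in numbers: local hardware pays for itself once your cumulative cloud usage exceeds the hardware premium. The prices below are hypothetical placeholders, not real vendor rates:

```python
# Hypothetical break-even for "rent" (cloud API) vs "buy" (extra RAM).
# Both prices are made-up placeholders, not real vendor figures.

cloud_cost_per_mtok = 10.0  # $ per million tokens (assumed)
ram_upgrade_cost = 300.0    # $ premium for a 32GB machine (assumed)

# Million tokens you'd consume before the hardware pays for itself:
breakeven_mtok = ram_upgrade_cost / cloud_cost_per_mtok
print(f"Break-even: ~{breakeven_mtok:.0f}M tokens")  # 30M tokens
```

A once-a-year user never reaches that; a daily heavy user might within months. That's the whole rent-vs-buy trade-off.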
-
It all reads like a giant racket. AI requires 32GB of RAM on your laptop, 32GB of RAM is expensive, so you have to lease, and it's expensive because AI requires RAM to run in the cloud. It's a problem in search of a solution, and it keeps making new problems along the way.
-
It's only a problem if you want to run AI. If you don't want AI, locally or cloud-based, then there's no need to spend the money on the high-end 32GB model (for AI purposes) or pay for a cloud subscription. No one is required to get the 32GB model if they don't want it.