[–]Hematomato 2 insightful - 1 fun - (2 children)

It's not the first. But I've tried to run uncensored LLMs before, and here's the thing: you need a monster of a graphics card to get decent results. With my little 8GB GPU, local models come out dumb as rocks.

To get something that's pushing the boundaries of the Turing Test, you're gonna need to pay Nvidia about, oh, $12K.
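For scale, here's the back-of-envelope VRAM math (a rough sketch; the 20% overhead fudge factor for KV cache and runtime buffers is my own guess):

```python
def vram_gb(params_billion: float, bytes_per_weight: float, overhead: float = 1.2) -> float:
    """Rough VRAM estimate: weights * bytes-per-weight, plus ~20% overhead."""
    return params_billion * bytes_per_weight * overhead

# fp16 = 2 bytes/weight, 8-bit = 1, 4-bit quant = 0.5
for params in (7, 13, 70):
    for label, bpw in (("fp16", 2.0), ("8-bit", 1.0), ("4-bit", 0.5)):
        print(f"{params}B @ {label}: ~{vram_gb(params, bpw):.1f} GB")
```

A 4-bit 7B model squeaks into 8GB with almost nothing left for context; anything big enough to flirt with the Turing Test does not.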

[–]Myocarditis-Man 2 insightful - 1 fun - (1 child)

If the GPU market weren't in the toilet, mainstream cards would not be shipping with a useless 8GB of memory in 2023. Maybe Intel will put pressure on the AMD/Nvidia cartel, if Nvidia doesn't squeeze them out of the market first.

https://overclock3d.net/news/gpu_displays/nvidia_reportedly_pressures_partners_to_stop_them_building_next_gen_intel_battlemage_gpus/

Till then, there are used 24GB Tesla cards on eBay.
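A minimal sketch of what 24GB buys you, assuming llama-cpp-python and a 4-bit GGUF model (the file path here is made up; substitute whatever quant you've downloaded):

```python
from llama_cpp import Llama

llm = Llama(
    model_path="./models/llama-2-13b.Q4_K_M.gguf",  # hypothetical local file
    n_gpu_layers=-1,  # -1 = offload all layers; a 4-bit 13B fits in 24GB easily
    n_ctx=2048,       # context window; bigger contexts eat more VRAM for KV cache
)

out = llm("Q: Why buy a used 24GB Tesla card? A:", max_tokens=48)
print(out["choices"][0]["text"])
```

On an 8GB card you'd set n_gpu_layers to a partial count and spill the rest to CPU, which is exactly the slowdown being complained about above.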

[–]iamonlyoneman[S] 2 insightful - 1 fun - (0 children)

> eBay

Correct. Retail is for suckers when you can afford to wait a week to find the right part!