OkOkPlayer OP t1_j23u7ou wrote on December 29, 2022 at 1:39 PM
Reply to comment by vizim in [D] Is there an affordable way to host a diffusers Stable Diffusion model publicly on the Internet for "real-time"-inference? (CPU or Serverless GPU?) by OkOkPlayer
No. I only saw "closed beta", but they do have documentation for it. Since my project is currently on hold for other reasons, I haven't looked into it further.
OkOkPlayer OP t1_j23tq7b wrote on December 29, 2022 at 1:35 PM
Reply to comment by vizim in [D] Is there an affordable way to host a diffusers Stable Diffusion model publicly on the Internet for "real-time"-inference? (CPU or Serverless GPU?) by OkOkPlayer
No, unfortunately not. Replicate looked the most promising, but hosting your own custom models there is still in closed beta at the moment.
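For context outside the thread, this is roughly what calling a model hosted on Replicate looks like with their Python client. The model reference, prompt, and environment setup below are placeholders for illustration, not details from the thread, and running your own custom model this way still depends on the closed-beta access mentioned above.

```python
# Sketch: calling a Stable Diffusion model hosted on Replicate via its Python client.
# Assumes REPLICATE_API_TOKEN is set in the environment; the model/version string
# below is a placeholder and must be replaced with a real (or your own) model.
import replicate

output = replicate.run(
    "stability-ai/stable-diffusion:<version-hash>",  # placeholder model reference
    input={"prompt": "a photo of an astronaut riding a horse"},
)
print(output)  # typically a list of generated image URLs
```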
OkOkPlayer OP t1_iz3posd wrote on December 6, 2022 at 5:59 AM
Reply to comment by MonstarGaming in [D] Is there an affordable way to host a diffusers Stable Diffusion model publicly on the Internet for "real-time"-inference? (CPU or Serverless GPU?) by OkOkPlayer
Yes, but if I understood correctly this is CPU-only, not GPU.
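For readers outside the thread, the CPU-vs-GPU distinction mostly comes down to where the diffusers pipeline is placed. The sketch below is an illustrative assumption (the checkpoint name and dtype choices are not from the thread); on CPU, a single Stable Diffusion image typically takes tens of seconds or more, which is why CPU-only hosting is hard to call "real-time".

```python
# Sketch: the same diffusers pipeline on GPU vs. CPU (illustrative, not from the thread).
import torch
from diffusers import StableDiffusionPipeline

model_id = "runwayml/stable-diffusion-v1-5"  # example checkpoint, swap in your own

if torch.cuda.is_available():
    # GPU: half precision keeps memory usage manageable and inference fast.
    pipe = StableDiffusionPipeline.from_pretrained(
        model_id, torch_dtype=torch.float16
    ).to("cuda")
else:
    # CPU: works, but a single 512x512 image can take tens of seconds or more.
    pipe = StableDiffusionPipeline.from_pretrained(model_id, torch_dtype=torch.float32)

image = pipe("a photo of an astronaut riding a horse").images[0]
image.save("out.png")
```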
[D] Is there an affordable way to host a diffusers Stable Diffusion model publicly on the Internet for "real-time"-inference? (CPU or Serverless GPU?)
Submitted by OkOkPlayer t3_zdfrnw on December 5, 2022 at 6:52 PM in MachineLearning (13 comments)
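To make concrete what "hosting for real-time inference" involves, here is a minimal sketch of a diffusers pipeline behind a small FastAPI endpoint. FastAPI, the route name, and the request schema are assumptions for illustration, not something proposed in the thread, and the sketch deliberately leaves open the question the post asks: whether to run this on a cheap CPU instance or a serverless GPU.

```python
# Sketch: a minimal FastAPI wrapper around a diffusers pipeline (assumed setup,
# not something proposed in the thread). Run with: uvicorn app:app
import base64
import io

import torch
from diffusers import StableDiffusionPipeline
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

# Load the model once at startup; the device choice drives the cost/latency trade-off.
device = "cuda" if torch.cuda.is_available() else "cpu"
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # example checkpoint
    torch_dtype=torch.float16 if device == "cuda" else torch.float32,
).to(device)


class GenerateRequest(BaseModel):
    prompt: str
    steps: int = 30


@app.post("/generate")
def generate(req: GenerateRequest):
    # Run the diffusion loop and return the image as base64-encoded PNG.
    image = pipe(req.prompt, num_inference_steps=req.steps).images[0]
    buf = io.BytesIO()
    image.save(buf, format="PNG")
    return {"image_base64": base64.b64encode(buf.getvalue()).decode()}
```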