If you make a subtly sensual photorealistic Tinkerbell and post it, I’ll worship you forever.
I’m forever your humble servant.
These are great! Thank you!!!
Cheers!
May I ask what hardware do you use to run this application? The community’s about section mentioned that the pinned post had some info, but I couldn’t find it…
I’m using a MacBook Pro with an M2 Max and 32 GB of RAM.
Get the hell out, for reals?!
I always thought that this kind of application would require a beefy desktop computer with powerful GPUs.
Are these images generated offline?
It’s a ridiculously powerful machine. Running AI stuff made the fans spin up for the first time. It destroys everything else.
May I ask what models you use to generate these? I got DiffusionBee, and while the two models it downloaded by default are impressive on their own (I mean… text to images? Magic!), the results are nowhere near as good as your images.
I think it’s that you need to be able to throw parallel processing at a lot of RAM. If you want to do that on a PC you need a GPU with a lot of VRAM, which you can only really get as part of a beefy high-end card. You can’t buy a midrange GPU and duct-tape extra DIMMs onto it.
The Apple Silicon architecture has an OK GPU in it, but because it shares unified memory with the CPU, essentially all the RAM in the system is available as GPU RAM. So Apple Silicon Macs can really punch above their weight for AI applications, because the GPU can use a lot more RAM.
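To make that concrete, here’s roughly what running one of these models locally on an Apple Silicon Mac looks like with PyTorch’s MPS backend and the Hugging Face diffusers library (the checkpoint name and prompt are just placeholders, not necessarily what’s being used in this thread):

```python
import torch
from diffusers import StableDiffusionPipeline

# The GPU on Apple Silicon (the "mps" device) allocates out of the same unified
# memory pool as the CPU, so the model weights effectively live in system RAM.
assert torch.backends.mps.is_available(), "needs an Apple Silicon Mac with MPS-enabled PyTorch"

# Placeholder checkpoint; any Stable Diffusion model from the Hub loads the same way.
pipe = StableDiffusionPipeline.from_pretrained(
    "stabilityai/stable-diffusion-2-1",
    torch_dtype=torch.float16,
).to("mps")

# Once the weights are cached locally, generation runs entirely offline.
image = pipe("a cozy cabin in a snowy forest, golden hour").images[0]
image.save("cabin.png")
```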
M2 is a damn good chip, and this is coming from someone who hasn’t bought and probably never will buy an Apple product.
Yes, they’re generated offline.
What did you run to generate these? They turned out pretty coherent if not fantastic.
Model, sampler and postprocessing?
Hmm… Lemme see something - hope these are what you meant
https://pxlmo.com/i/web/post/587255699614443749
https://pxlmo.com/i/web/post/587255982785964290
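In case the metadata doesn’t survive the upload: in diffusers terms the “sampler” is the scheduler, and model, steps and guidance get set roughly like this (the checkpoint name and numbers are just an example of the shape, not the exact setup behind these images):

```python
import torch
from diffusers import StableDiffusionXLPipeline, DPMSolverMultistepScheduler

# Placeholder checkpoint; swap in whichever model you actually run.
pipe = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0",
    torch_dtype=torch.float16,
).to("mps")  # use "cuda" on an NVIDIA card

# The "sampler" dropdown in most UIs maps to the scheduler here.
pipe.scheduler = DPMSolverMultistepScheduler.from_config(pipe.scheduler.config)

image = pipe(
    "portrait photoshoot, soft natural light",  # example prompt
    num_inference_steps=30,  # sampler steps
    guidance_scale=7.0,      # CFG scale
).images[0]
image.save("portrait.png")
```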
If you feed me a prompt I’ll go HAM for ya.
I can up the photorealism next time; this one is more of an Instagram-photo kind of photorealism, heh. Probably because I put in ‘photoshoot’, hmmm.
Sure, I’ll give it a go!