Do AI pictures have a place?

I understand that AI-generated pictures are usually frowned upon, but I do think they can give people some inspiration to make real-life art from them.

Take this picture, for example: it was done in AI, but I would love to see someone actually make something similar to it in real life. I have messed with a Lego-type LoRA and you'd be surprised at some of the details you can re-imagine for your own builds.
Just wondering: what do you all think of AI/Stable Diffusion/etc.?

When it first came out I loved it; I started creating loads of images, mostly of TARDISes on alien landscapes (although they never got the Police Box right).

Then I read more about how they stole lots of people’s art to create the models, and the news about it keeps getting worse and worse, so now I’m not a fan. Also it puts real artists out of a job.

It’s a shame because it is so cool. But I can’t in good conscience use it.


But that is a lovely piece, I wish someone could make it in real life!

That is why I think people need to use it as a guide instead of just putting in what they want and picking the best result. Generate a handful of pictures as drafts and expand on them yourself, but I know that's not what most people use it for.

I tend to feel it’s a grey area. The AI doesn’t actually keep any of the pictures it was trained on. (In fact, if you compare the size of the model to the number of pictures, no compression technique is that good…)

The process is along these lines: take a large folder of pictures with descriptions. Grab one picture and tokenize its description. Slice that picture into squares and pick one. Take a same-sized square of random noise, then try denoising techniques on it until it looks similar to the square from the picture. Then file away that denoising method with the tokens, toss the square, and keep training. All the training data is tossed at the end.

Later, you type in a prompt and it generates some more random noise, and uses the denoising methods that go along with the tokens from your prompt on it.
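The process described above can be sketched in a few lines of toy Python. This is purely an illustration of the idea, not how any real system works: actual diffusion models like Stable Diffusion use neural networks over image tensors, many noise steps, and a text encoder for the prompt. Here the "model" is a stand-in that just guesses flat grey noise, and an "image patch" is a flat list of numbers.

```python
import random

# Toy sketch of the training/generation loop described above.
# An "image patch" here is just a flat list of pixel values in [0, 1].

def add_noise(patch, amount):
    """Forward process: blend a clean patch with random noise."""
    noise = [random.random() for _ in patch]
    noisy = [(1 - amount) * p + amount * n for p, n in zip(patch, noise)]
    return noisy, noise

def training_step(patch, predict_noise):
    """Noise a patch, ask the model to guess the noise that was added,
    and score the guess with mean squared error. A real trainer would
    then nudge the model's weights to shrink this loss; the patch
    itself is thrown away afterwards."""
    amount = random.random()
    noisy, true_noise = add_noise(patch, amount)
    guess = predict_noise(noisy, amount)
    return sum((g - n) ** 2 for g, n in zip(guess, true_noise)) / len(patch)

def generate(predict_noise, size, steps=10):
    """Generation: start from pure random noise and repeatedly
    subtract a fraction of the model's predicted noise."""
    patch = [random.random() for _ in range(size)]
    for _ in range(steps):
        guess = predict_noise(patch, 1.0)
        patch = [p - g / steps for p, g in zip(patch, guess)]
    return patch

# Stand-in "model" for the demo: always guesses uniform grey noise
# instead of actually learning anything from the prompt tokens.
dumb_model = lambda noisy, amount: [0.5] * len(noisy)

loss = training_step([random.random() for _ in range(16)], dumb_model)
image = generate(dumb_model, size=16)
```

The key point matches what's said above: after training, only the learned denoising behaviour survives, and the training pictures themselves are discarded.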

The pictures used for training were also ones that were freely downloadable on the internet, and someone learning how to draw is likely to train themselves to draw like an artist they admire without permission, too…

Not saying I don't see the artists' point, I just think it's a lot more complicated than stolen art. I'll also say that a lot of ai art needs input, and often fixing up, by a real artist. AI art of a person is often a lot like those creatures from Wild Blue Yonder, and I have a feeling there may have been some commentary there…

(It’s also one of those topics that kinda fascinates me, because I’m good at technology and interested in art, and definitely interested in ai. I do label anything ai-generated as such, though, and on Mastodon, CW it.)


Yes I agree with those exact points but many artists still believe it is stealing images and stealing work.

There was also a recent case where the New York Times found that ChatGPT could spit out passages of their articles nearly verbatim, all from the model, so LLMs (text-based AI) at least can end up memorizing some of the originals.

Oh well, it’s difficult to know what to do about it now that the genie is out of the bottle!

I need loads of art for my site and don’t have the budget to pay an artist so I did consider using AI but thought it would put lots of people off. I posted some AI art on social media and got shouted at haha!


Fortunately, most of the ai art I’ve been doing is for my own entertainment and mostly not public facing. My avatar over on Mastodon is currently ai art, though the avatar here is an older one I drew myself.

(And there is something called “overtraining” that can happen when you feed it too many nearly identical pictures in the training…)

One unfortunate thing I can potentially see happening is the free ai art generators ending up becoming illegal… but ai art generators like Adobe’s, where they can claim to own everything it’s trained on, staying legal. And Adobe’s is not really that much more ethical.

They trained it on their stock library, which, sure, people signed their rights away for money, but some of it was ai art from other ais, and the people selling their rights didn’t know it was going to be used for ai training.

I do feel like the fact that I can generate pretty good ai art using nothing more than my computer without even getting online means it’s here to stay in some form, legal or not.

(Of course, part of me thinks the main problem is that a small group of people have most of the money, and that fewer jobs being needed should just mean that fewer people need to work. :stuck_out_tongue: )


Yes exactly, boo capitalism. In an ideal society it wouldn’t matter that AI can take over all jobs, because people wouldn’t need to work, like in Star Trek.

But let’s not get political on here :rofl:


No problem, probably time to go back to Doctor Who discussion…


There’s a trope for that!

Need to think of other stories that are about this:

(It pulling in the synopsis of one of the stories as this short description is a bug at the moment…)


Well, “Kerblam!” should have been about that trope. I was fully expecting it to be.

I could argue Dalek and The Long Game…


Kerblam is “Capitalism is good, actually” lol

Yeah The Long Game and Dalek are good shouts.

Thanks. Honestly, thinking about it, I could practically argue season 1.

Aliens of London/World War 3.


Yeah! I’ll add those later :smiley:

I also have a trope called “Profiting from Time Travel” which is related but specifically for people who try to get rich using time travel. Warning, contains spoilers.

… and we have gone pretty off topic here haha

Well, that’d also be The Long Game, as well as City of Death.

Sometimes I nudge controversial topics off-topic, I’ll admit, though a separate thread for naming episodes that belong to tropes would be a good idea.

It’s like an onion: it has layers, and there isn’t a straight yes/no answer.

It’s frustrating because you can tell that Kerblam is not trying to be that; it just stumbles into it because it does an atrocious job of getting its actual message across.


What is Kerblam’s actual message?

It’s going for a sort of “the system/automation is not bad; it’s being abused by managers using it to replace and monitor workers, and by Charlie using it to kill”, but it kinda fails to deliver that clearly and stumbles into being easily read as “the corporation is fine and it was just this one individual who was a problem”.

Not to mention the stereotyping involved in who the terrorist was…