AI Art Training Data as Exploitation of Artists
Commercial AI art companies such as Midjourney, Stability AI, and OpenAI train image models on massive scraped datasets containing work from thousands of illustrators, concept artists, and photographers who never gave explicit consent or received compensation.
Automation vs Exploitation in AI Art
AI art systems function as powerful tools for non-artists, hobbyists, and working illustrators, while also serving as labor-saving devices for studios and clients who might otherwise hire human artists.
Data Laundering and Nonprofit Loopholes in AI Art
Dataset builders like LAION; commercial AI companies such as Stability AI, Midjourney, and OpenAI; and the artists and ordinary users whose images, sometimes private or sensitive, end up in scraped datasets without consent.
Skill Theft and the Problem of Digital Ownership
Digital artists whose styles become widely imitated, AI companies training on their work, and everyday users who type specific artist names into prompts to get better-looking generated images.
Opt-Out Systems and the Impossibility of Model Forgetting
Artists searching sites like Have I Been Trained, companies such as Spawning AI that offer opt-out tools, and model builders like Stability AI that promise to respect removal requests in future model versions.
Fair Use vs Industrial-Scale AI Scraping
YouTubers and critics who reuse clips or images under fair use, AI art companies claiming similar protection for training scrapes, copyright holders, and the artists whose work becomes raw material for model training.
Practical Ethics for Using AI Art
Individual creators who enjoy AI art, YouTubers who monetize generative visuals, and everyday users deciding which tools to pay for, as well as the artists whose livelihoods are indirectly affected by these choices.