As a front-end developer, I have always been cautious about the rise of generative AI. Tools like Midjourney, which generates images from text prompts, have fascinated many of us in the tech and creative industries, offering a glimpse into what’s possible when machine learning meets visual storytelling. But the recent court case filed by Disney and Universal signals that we are entering a new, more complex phase of that journey, one with real implications for ethics, law, and innovation.
According to reports, Disney and Universal allege that Midjourney used their copyrighted characters, like Darth Vader (Star Wars) and the Minions (Despicable Me), without permission. In other words, the AI learned what these characters look like by training on Disney and Universal’s original content, and the studios claim this has produced an “endless” stream of AI-generated images based on their intellectual property. In their view, Midjourney’s actions amount to digital piracy, plain and simple.
This is not a small or isolated dispute between two companies. It is the first time major Hollywood players have taken direct action against a generative AI platform, and it could shape how AI tools are allowed to operate in the future.
Midjourney’s side of the argument, at least as described by its CEO in past interviews, is that the model “learns like a person” by analyzing images across the internet: just as a human artist might study various styles to create something new, so does the AI. That sounds harmless on the surface. But there is an important distinction to be made here: a human studying and being influenced by art is not the same as an AI model being trained on copyrighted content and reproducing it at scale.
Imagine a tool trained on code you wrote, using it to generate projects for others without your permission, your name, or any credit. It feels wrong, doesn’t it? That’s how many artists and studios feel right now.
As developers, we often sit at the intersection of different kinds of creative work. We rely on open-source libraries, design inspiration, and shared knowledge, but we also understand the importance of crediting original creators and respecting licenses. The Midjourney case highlights just how thin the line can be between building with data and misusing it.
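On the developer side, “respecting licenses” can be something concrete rather than abstract. As a minimal sketch (my own hypothetical example, not anything from the lawsuit), here is a small Node/TypeScript script that lists the declared license of each direct dependency in a front-end project, so you at least know whose work you are building on and under what terms:

```ts
// license-audit.ts — hypothetical sketch: list declared licenses of direct dependencies.
// Assumes a standard Node project with package.json and an installed node_modules folder.
import { readFileSync } from "node:fs";
import { join } from "node:path";

// Read the project's own package.json to find its direct dependencies.
const pkg = JSON.parse(readFileSync("package.json", "utf8"));
const deps: string[] = Object.keys({ ...pkg.dependencies, ...pkg.devDependencies });

for (const name of deps) {
  try {
    // Each installed dependency ships its own package.json with a "license" field.
    const depPkg = JSON.parse(
      readFileSync(join("node_modules", name, "package.json"), "utf8")
    );
    console.log(`${name}: ${depPkg.license ?? "unknown"}`);
  } catch {
    console.log(`${name}: not installed`);
  }
}
```

It is a tiny habit, but it is the same principle the studios are arguing about: know where the material you build on comes from, and what you are allowed to do with it.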
This lawsuit also pushes us to ask bigger questions: where does training data come from, who consented to its use, and who gets credit or compensation for it?
This legal battle might reshape the way generative AI is regulated, not just in the U.S., but globally. And it could affect everything from AI art platforms to music generators and even code-assist tools.
I don’t believe innovation should be stopped, but I do believe it should be fair and respectful. As tech professionals, we can’t afford to focus only on what’s possible; we also need to ask what’s right. The tools we build and use shape the future, and that future should be welcoming, ethical, and grounded in mutual respect for all creators, whether human or algorithmic.
Whether you’re building interfaces, training models, or just experimenting with AI tools, this lawsuit is a wake-up call. It’s not just about legality; it’s about setting ethical standards in an era where the lines are blurring faster than ever.
Having these questions is a good thing: it means I’m thinking deeply and carefully about how to use AI responsibly.