Meet the Design Assistant: He Will Steal Your Style
“With the power of AI and machine learning, the Design Assistant learns from you and makes suggestions based on your design… continually evolving and learning from your particular choices.”
— SolidWorks, 2025 (https://www.solidworks.com/lp/evolve-your-design-workflows-ai)
This sentence, taken from an official SolidWorks page, casually describes a process that is in fact profoundly delicate: artificial intelligence observes, records, and learns. In other words, it learns from the people who design. Not simply to perform better, but to become more “intelligent”, more autonomous, more capable of anticipating and suggesting.
But if AI really learns from designers, then the stakes are anything but neutral. The designer is no longer just a user: they are an active participant in the training of a commercial system. A system that, over time, could reproduce their logic, their patterns, even their design style. All this without the user having given informed consent or having any real control over the fate of that data.
In the publicly accessible legal documents, there appears to be no explicit reference to this type of use. There is no clear mention of the possibility that design data, even in aggregate or anonymized form, could be used to train AI models distributed at scale. Yet promotional materials and the software's own features suggest a dynamic of continuous learning that takes place through designers' daily use.
The paradox becomes sharper if we compare this with very different business models. Google, for example, offers most of its services for free and, in exchange, uses user data to improve its offering. It is an implicit but well-known pact. CAD software houses, on the other hand, charge high prices for access to their tools, and in the meantime seem to retain the right to collect and capitalize on the value generated by users.
It is a double contribution: economic and cognitive. And there appears to be neither transparency nor compensation in return.
The issue becomes even more relevant if we look at what is happening in the world of images. Generative AIs can now produce images “in the style of Studio Ghibli,” “in the manner of Moebius,” or “with the visual sensitivity of Saul Bass.” Behind every style, however, there is a vision, a career, a creative voice. And today we are rightly asking who has the right to replicate these visual identities, and under what conditions.
Now imagine transferring this same dynamic to product design. What if tomorrow a company asked an AI to generate a product “in the style of ....”? What if the system had learned, over time, your way of designing and could reproduce it without your intervention?
This is not an abstract hypothesis. It is an imminent scenario. And it concerns not only intellectual property but also professional identity.
Today, design is still a human, critical, situated act. But without a collective reflection on what we are giving up, and to whom, it risks becoming, tomorrow, an automatable function. The problem is not artificial intelligence. It is the silence with which we have let it enter our tools, our workflows, our style, without asking for anything in return.
It’s time to ask the real questions. Who owns the intelligence that design produces? And what do we risk losing if we no longer recognize it as ours?
Let’s start the conversation. Now!