Fashion Mannequins Transformed Through 3D Printing

To chat to Paul Sohi is to geek out over all things 3D printed.  He takes me on a journey from 3D printed mannequins (the subject of his PhD) to a new polycarbonate composite prosthetic leg he is developing with a team spanning half a dozen countries but centred at Autodesk in San Francisco, for an Olympic cyclist bound for Rio later this year.  It’s a helluva ride, so buckle up!

What initially prompted me to get in contact with Paul was a question I’ve been pondering whilst working at the fringe of fashion and technology for some time.  Why aren’t there robot models?  And why don’t I create the first robot modelling agency?  It makes sense for so many reasons, but more on that in a later post.

Paul’s research and development at the Royal College of Art, in conjunction with the Makerversity at Somerset House, centres on solving an immense problem in mannequin manufacturing.  Mannequins are currently sculpted by hand before being moulded and cast – a time-consuming process which imposes mass standardisation.  As someone who has hired mannequins for London Fashion Week, I can attest to the limited offer currently on the market.  Consider a museum requiring a custom-sized mannequin to display historic clothing, and then consider a new technology allowing such a mannequin to be 3D printed in days rather than laboriously handmade over months.  Currently, the best way of creating a mannequin to display such costumes is to 3D scan the clothing to determine the volume inside it when worn, and to base the mannequin shape on that volume – in effect reverse-engineering a mannequin to mimic someone who really did live in and wear those clothes at some point in history.  On consideration of these weird truths it’s possible to begin seeing the benefit of Paul’s algorithm, which transforms actual body (or garment) measurements into 3D printed mannequins rather than relying on artistic creations inspired by – but anatomically untrue to – the human body.  The key here is that measurements entered into Paul’s program are manipulated and represented visually in line with actual anthropometric landmarks.  For example, height has an impact on body proportions.  It is incorrect to simply scale a mannequin up or down in direct proportion – there are intricacies in height ratios that Paul’s rigorous algorithm takes into account, so that the mannequins he 3D prints are true to the human form rather than a sculpted representation of an imagined ideal.
Shorter people’s legs are proportionately longer than their torso compared to taller people, for example, but you would not detect this by looking at them – both body proportions simply look ‘right’.  Herein lies the difficulty in artistically interpreting the human form where size and fit are concerned.  
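To make the scaling point concrete, here is a minimal Python sketch of the idea. Every name and number in it – the 170 cm reference figure, the torso and leg ratios, the linear drift term – is invented for illustration; Paul’s actual algorithm and its anthropometric data are not described in detail in this post.

```python
# Illustrative sketch only: the reference height, ratios and drift
# factor below are hypothetical, not Paul's real anthropometric data.

def mannequin_measurements(height_cm: float) -> dict:
    """Estimate key mannequin dimensions from height alone.

    Rather than scaling every dimension by the same factor, the
    leg-to-torso proportion shifts with height: shorter figures get
    proportionately longer legs, as described above.
    """
    reference_height = 170.0      # hypothetical reference figure, cm
    scale = height_cm / reference_height

    torso_ratio = 0.30            # torso length / height at reference
    leg_ratio = 0.47              # leg length / height at reference

    # Made-up linear adjustment: as height falls below the reference,
    # a little proportion moves from torso to legs (and vice versa).
    drift = 0.02 * (height_cm - reference_height) / reference_height
    leg = height_cm * (leg_ratio - drift)
    torso = height_cm * (torso_ratio + drift)

    return {
        "height": height_cm,
        "torso_length": round(torso, 1),
        "leg_length": round(leg, 1),
        # what a naive uniform rescale of the reference would give
        "naive_scaled_leg": round(reference_height * leg_ratio * scale, 1),
    }
```

For a 150 cm figure the sketch yields a leg length slightly longer than a naive uniform rescale of the reference figure, and slightly shorter for a 190 cm figure – the effect the paragraph above describes.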

Paul’s 3D printed scale mannequins being printed in parts for later assembly

The motivation behind Paul’s PhD was to find out whether he could create a 3D printed mannequin using mass-customisation algorithms built upon an immense amount of research underpinned by international standards.  These standards provide all the necessary body measurements to create a digital mannequin, which can then be 3D printed.



An important point made by Paul during our conversation is that mass customisation via 3D printing is now possible on a production scale – it has evolved beyond prototyping.  This means that standardisation of mannequins is no longer necessary and the skilled work required for each fashion retail market does not have to be localised.  Since a ‘standard’ size small in Asia is nothing like a ‘standard’ size small in Europe, mass customisation shatters geographical boundaries and means standardisation – at best badly sized and limited in terms of body shape, and at worst pushing damagingly unrealistic body ideals – is no longer necessary.  The mannequins Paul is developing can be tailored to cultural specificity: regional cuisine radically affects body shape, size and proportion, and genetics also has a considerable impact.  These factors can be taken into account in Paul’s algorithm.

A complete 3D printed scale mannequin

From an aesthetic point of view, every fashion brand has its own ideal mannequin, which in some cases may be seasonal.  These are made from master moulds and, if done by hand using current methods, take months.  3D printing takes a fraction of the time, allowing greater flexibility and mannequin diversity.

Components of a scale model printing whilst Paul and I chatted at the Makerversity

Paul describes his work as creating avatars and body forms.  He is currently working with the Victoria & Albert Museum, London to find rapid solutions for mannequin making for the display of historic costumes.  As an extension of this revolutionary development for display mannequins, Paul is looking at how the current mass standardisation of garment-making mannequins relates to sizing within the fashion industry.  There is no datum for mannequins – no system for sizing and no standard approach to it across the industry.  When creating clothing, we have anatomical landmarks (nape to waist, for example) but the way these are measured is still variable.  Paul is determined to standardise measurement taking and sizing to put an end to what is a slow, laborious and repetitive process.  He makes the point that three people in the fashion industry will measure the same dress and get three completely different sets of measurements.  Compare that to architecture – or any other creative industry – where you would be laughed at for not having and applying a set of standards.  He makes a strong point, and I have personally dealt with this often painful aspect of sampling and production in the fashion industry.  Paul is confident that a set of standards can be extrapolated from the points mapped in his algorithm.


Interestingly, Paul tells me that the standard nape-to-waist measurement of garment-making blocks used routinely in the fashion industry came from 1920s military uniforms.  Today’s approach to garment sizing and pattern proportions has only marginally evolved since then.  ‘Standard sizes’ are in truth specific to each individual fashion house and are not related to any actual standard, which to me makes sense because each fashion house/brand has its own silhouettes and ‘fit’ which are part of its aesthetic – but I can see how this isn’t customer friendly, and how in an increasingly e-commerce driven industry sizing standardisation would reduce returns and help consumers make better style choices.

Returning to museum mannequins: the Alexander McQueen exhibition Savage Beauty at the Metropolitan Museum of Art in New York was one of the most successful exhibitions of all time, but despite this, when it ended it was not picked up immediately by another institution.  The hand-sculpted mannequins, made specifically for the garments they displayed, were destroyed.  Shortly after, the V&A took on the exhibition and set about hand-making the mannequins all over again.  Almost a year later they were complete.  If these had been created using Paul’s 3D printing method, the process would have been simpler, quicker and less expensive.

Savage Beauty, The Metropolitan Museum of Art, New York

The exception within the Savage Beauty exhibition was the set of 3D printed Plato’s Atlantis mannequins, which closed the show.  See the 3D rendering by Asylums FX and a photo I took at the exhibition below:

Plato’s Atlantis, Savage Beauty, Victoria and Albert Museum, London

When I ask Paul about the response so far to his work, he says it has been met with distrust and caution from a number of museum curators and fashion designers who feel things are working just fine as they are.  The fashion industry is famously and paradoxically resistant to change (the out-of-synch seasonal cycles and some luxury brands still refusing to sell online are just two examples) – but why isn’t the way things are done being challenged?  Why can’t we do things better, and explore technology to get there?  As long as we pose the questions, it appears technology will provide the answers.

Paul and I leave the Makerversity disagreeing over the recent Batman v Superman film (he’s a fan, I’m not) and agreeing on the amazingness of The Hulk.  I wish him well on his bumpy but worthwhile journey to fashion mannequin disruption.

Header image: Paul Sohi

For more about Paul’s work, click here and follow him on Twitter

Follow me:  Twitter @Thetechstyler  and  Instagram @techstyler

McQueen and a Bee Named Beyoncé

Bees have been featuring heavily in my life lately. From blogpost one, featuring Bastian Broecker’s robot swarm algorithms based on the behaviour of bees; to a wonderful gift of bee pollen and propolis-laden honey from Pablo Villasenin of Toca honey in Galicia, which threw me back into rude health after a hectic time at London Fashion Week; to a very special Pearly Queen beekeeping session this morning at Stepney City Farm in East London. It was special because I learnt more about how bees communicate via a waggle dance that is fundamentally based on physics, and I also managed to spot the Queen Bee amongst a hive of around 8,000 bees. Well, she is called Beyoncé, so it’s not surprising she was dominating the crowd and making her presence felt. I also got to wear a beekeeping suit (I am an ardent onesie fan – usually of the pilot suit variety) so stylistically, I felt right at home.






The Queen bee rules the hive for her lifetime of up to five years, long outliving the worker bees (the other female bees) and drones (male bees), whose lifespan is around six weeks. The Queen bee mates once, one mile in the air above the hive (mile high club, anyone?) with up to 20 drones. She stores the semen in her bountiful “hips” for her lifetime, fertilising her eggs according to how and when she wishes to populate the hive. She can lay up to 2,000 fertilised eggs per day. Interestingly, John from Pearly Queen tells us that a group of French bees located near an M&M factory in Alsace has been making coloured honey after visiting waste sites containing the coloured shells. M&M-flavoured honey, anyone?



The fascinating bee behaviour and the protective suits, mesh hats and long gloves got me thinking about beekeeping-inspired fashion. My research led me to Alexander McQueen SS13 and Jean Paul Gaultier’s SS15 collections.

Once you understand the powerful pheromone-driven sexuality of the Queen bee, and the devastating and sudden demise of all drones who mate with her, the premise of Sarah Burton’s dark, sexually charged bee-inspired SS13 collection for Alexander McQueen becomes clear and potent.  The collection featured a metamorphosing hexagonal digital backdrop, honeycomb jacquards, tortoiseshell accessories and cage-like bodices with ornamental bees, reminding me of the wooden frames of the hive I saw today and adding a rich, glossy, amber quality like the nectar and pollen inside the honeycomb.







If I was developing a collection inspired by beekeeping, I’d consider the corruption of honey by artificial colours from the M&M factory and take it to a more techno place. I’d draw influence from beekeeping suits and all-over protective clothing including NASA space suits, as well as functional fastenings, including zips. I’m also a fan of the gauzy drapery around the neck and in the mesh of vintage beekeeping attire. I’d also develop knitted structures to mimic hexagonal shapes and create a honeycomb dimension. Here are my beekeeping/spacesuit/functional outerwear mood boards.

Mood board 1

Mood board 2

Mood board credits: Givenchy Haute Couture, Toogood Outwear, Noemi Anna Tina Ceresola, NASA space suits.

Header Image: Raquel Zimmerman by David Sims. 


Fashion’s Robots: McQueen vs Plein

Just a day after publishing my first blogpost on AI, robotics and fashion came a runway show featuring all three. Designer Philipp Plein showed his SS16 collection at Milan Fashion Week featuring robot band Compressorhead, with Courtney Love on vocals, and robot arms ‘styling’ the models with sunglasses and bags as they travelled along a conveyor belt. There were also drones flying overhead, whose function I’m not sure of.  Robots in fashion suddenly seems topical.


Header image: NYTimes. Image above: Wonderland Magazine

Dazed Digital reported that Philipp Plein’s shows are not normal shows (previous shows have included rappers on jet skis in customised Plein pools) but never had he taken a show this far in terms of concept and techno-grandeur. Each season he aims to outdo himself, apparently leaving the crowd wondering “what next?”  I wasn’t at the show, but watched it in full here

What interested me (beyond the impressive robot theatrics and investment in spectacle) was the reaction. A mixture of wonder, horror, anger and derision. I was lecturing today and played the show video for my fashion students and a handful replied in horror “but they’re just copying McQueen!” “McQueen did it better/first!” The rest were silent/amazed. Instagram is littered with similar comments. Well, McQueen did show industrial robots in his disturbingly beautiful S/S 1999 show, but with a very different theme. His concept was inspired by an installation by artist Rebecca Horn of two machine guns firing blood-red paint at each other.

rebeccaHorn-520x716    McQ.851a–d_mcq.851_v3.AV3

Hear model Shalom Harlow speak about her experience being painted by the industrial robots in McQueen’s show.

Philipp Plein’s show was a rock and roll glamour extravaganza in which the robots were a cool addition of props but not apparently integral to the delivery of the clothing and the collection’s narrative. It looked like one hell of a show though, and hugely entertaining. Nothing wrong with that, surely? Ellie Pithers at the Telegraph said “If it was just entertainment – and the clutch of blondes jiggling along next to me in their Plein studded boots and slashed jersey dresses certainly enjoyed themselves – then it was spot on.”

I think the integration of technology, AI, robotics in the fashion product and its delivery is key if such a show is to convince.  At the whiff of gimmickry, maybe the audience recoils a little? Authenticity is paramount in delivering a show that people connect with and truly buy into (on a deeper than commercial level). So, impressive as it is, maybe it’s not entirely convincing? Hashtag showoff?





Images: Wonderland Magazine

The only connection between the McQueen and Plein shows is robots. The use and motivation behind them as technical tools is entirely different. With McQueen’s use of technology, the story he was telling and the clothes themselves were still always the main event and the driving force. It was reported that the industrial robot arms that painted Shalom Harlow’s dress #13 took a week to program in order to ensure a choreographed connection with Shalom and the expression of McQueen’s vision. In contrast, at the Plein show, Compressorhead were doing what they already do (see them in action below) and the industrial robot arms which are designed to perform the action of moving objects to/from a production line appeared to be doing just that. There’s no apparent integration of the technology and the reason for the show in the first place – the clothes. The show reviews I have read of Plein’s SS 16 collection barely mention the clothes. Indeed, the show began with Courtney Love and Compressorhead rocking out. It seemed to be a declaration of entertainment first, fashion second.

As a (cool, but slightly disturbing) aside, Compressorhead were created by Robocross Machines, who have machines for all occasions. Check out Stickboy’s fascinating CV, including date of birth (2007), specifics (four arms, two legs and one head) and playlist…

[Screenshot: Stickboy’s CV on the Robocross Machines website]


For those concluding the Plein show was a “ripoff” of McQueen’s: by that definition, any designer using the same machine/object/material as another designer would be copying too. However, it’s apparently fine to use the same materials/colours/silhouettes as other designers (indeed that helps to create a trend, so it’s quite useful for the industry), but less acceptable to use the same technology/theme to present a show. Most designers wouldn’t go anywhere near a show theme utilised by McQueen. Ever since McQueen used the Pepper’s Ghost technique to project a flutteringly angelic Kate Moss, who’d go there? I’m not suggesting designers shouldn’t go there, I’m merely illustrating the point that they generally don’t. Except Plein.

Fashionista quoted Plein as saying he feels he’s an industry outsider, that he doesn’t have support from the industry, and that he has a deep-rooted fear that people won’t turn up to his shows, as they didn’t in the beginning. He insists that as a self-funded, investor-less concern competing with the likes of Gucci and Chanel, his spectacular shows are par for the course, and that he is spending less than his rivals (who aren’t criticised for such excess). There’s a definite tinge of derision in the article as they go on to claim he “rants” about robots taking over. The Elle headline below is also derisory. Fashionista do, however, admit that Milan Fashion Week would be a lot less fun without his spectacular shows.

[Screenshot: Fashionista article headline]

[Screenshot: Elle headline]

Following her comment that the entertainment factor was spot on, Ellie Pithers went on to say of the Plein show: “If it was meant to be a parody of the fashion industry – the conveyor belt demands of the schedule, the robotic nature of trends, the deliberate mechanics behind product placement – it was even better.” I’m not convinced there were metaphorical intentions, and this, her final statement, seems a fairly cold conclusion. The choice of Kraftwerk’s “Robots” and “The Model” further supports the literal nature of Plein’s message.

I think there’s a lesson to learn here about future conversations involving those in technology and fashion seeking to fuse disciplines: integrate or irritate. The Apple Watch has been met with a tepid response for failing to create a genuine, desirable aesthetic fusion. “Just because you put a strap on it doesn’t make it a watch.” Watch out, Wearables.
