Bottletop’s Flagship Store – A Symbiosis of Sustainability and Tech

I know I’m not alone when I say it takes more to get me into a retail store these days than ever before. Shopping online is the ultimate convenience, so stores have to go bold and offer something really special to get shoppers through the door. Enter Bottletop, the sustainable luxury accessories brand whose newly launched flagship store on Regent Street sports a KUKA robot in the window, with films telling the story of their responsibly sourced and produced products projected onto the store walls. When it comes to fashion brands, this isn’t your average sustainability story. Let me take a step back and explain exactly what makes Bottletop a sustainable luxury brand, and how their ethos extends from the product, to the store, to the engagement of cutting-edge robot technology in the form of the KUKA LBR collaborative robot.

Render of final store – Image:  Bottletop

The Bottletop fashion company’s journey began in 2012, when co-founder Oliver Wayman’s mum picked up an up-cycled ring-pull and crochet bag in Salvador, Brazil – a neat fusion of readily available waste and the craft of crochet, making a bag that is light and strong. That find led to a partnership with artisans in Brazil which has grown into an atelier producing the brand’s signature products and developing new materials for future product lines. Bottletop bags are made from discarded ring-pulls sourced in Brazil, along with locally sourced yarns for crochet and responsibly produced Brazilian leathers certified ‘Amazon Zero Deforestation’, guaranteeing zero impact on protected forests from cattle farming and grazing. Underpinning the fashion brand is the Bottletop Foundation, founded in 2002 by Oliver’s co-founder, Cameron Saul, which raises funds for social enterprise initiatives across Africa, Brazil and the UK.

So what spurred a sustainable fashion duo to delve into the world of robotics and 3D-printed interiors for the launch of their flagship store in December this year? At least in part, for the reasons mentioned in my opening paragraph – retail needs to offer customers an experience and tell a story – but also, according to Oliver, because they wanted to do something different and juxtapose the hand-made, natural elements of their products with a very high-tech interior. “Using natural, sustainable materials would have been an obvious thing to do,” he explained, but they wanted to be more ambitious than that and offer their customers something unexpected. A brainstorming session between Oliver and a friend, Paolo Zilli of Zaha Hadid Architects, led to a discussion with Krause Architects, who were already exploring robotic manufacturing, and inspired the Bottletop team to delve into this brave new robo-tech retail world. The team of collaborators then grew to include AI Build, who are 3D printing interior surfaces designed by Krause Architects, and Reflow, who created the 3D-printing filament from 100% recycled plastic. The primary purpose of Oliver and Cameron’s tech-led shop fit and KUKA installation is to use technology as a storytelling tool and to foster an understanding amongst consumers of the place new technologies have in our world and within their business – in this case, facilitating the use of a new and exciting recycled plastic material in their store design and build.

A 3D printed wall panel shaped to hold bag handles for display

The in-store storytelling of the Bottletop brand begins with the window display, featuring the signature Paco Rabanne-esque ring-pull ‘Bellani’ bags and the enamelled ‘Mistura’ clutches developed in collaboration with Narciso Rodriguez, amongst which moves a KUKA robot 3D printing bag charms from 100% recycled plastic. This recycled PET plastic was created from plastic bottles rescued from the ocean and processed into a thin printable plastic tube – a 3D-printing filament. The concept is akin to Parley for the Oceans’ collaboration with Adidas, which used recovered plastic spun into yarn for trainers and clothing; instead of spinning the recovered plastic bottles into a yarn, however, Bottletop collaborators Reflow have processed the plastic into a continuous filament, which the KUKA robot heats and extrudes through a 3D-printing ‘gripper’ attachment fixed to the end of the robot arm, printing the bag charms by depositing successive layers of molten plastic – a process known as additive manufacturing.
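For the technically curious, here’s a minimal sketch of that layering idea in code – emphatically not AI Build’s actual toolchain, just an illustration of how an extruder is typically driven: the part is described as a stack of perimeters, and the machine receives one thin layer of moves at a time. The circular charm shape, dimensions and layer height are invented for the example.

```python
import math

# Toy G-code generator illustrating additive manufacturing: the part is
# built as a stack of thin perimeters, one deposited layer at a time.
LAYER_HEIGHT_MM = 0.4   # assumed thickness of each deposited layer
RADIUS_MM = 15.0        # assumed radius of a circular "charm"
N_LAYERS = 25           # 25 layers x 0.4 mm = a 10 mm tall part
SEGMENTS = 64           # straight moves approximating the circle

def circle_layer(z: float) -> list[str]:
    """G-code moves tracing one circular perimeter at height z."""
    moves = []
    for i in range(SEGMENTS + 1):
        theta = 2 * math.pi * i / SEGMENTS
        x = RADIUS_MM * math.cos(theta)
        y = RADIUS_MM * math.sin(theta)
        # G1 = linear move; a real printer would also meter extrusion (E axis)
        moves.append(f"G1 X{x:.2f} Y{y:.2f} Z{z:.2f}")
    return moves

gcode = ["G28 ; home all axes"]
for layer in range(N_LAYERS):
    gcode.extend(circle_layer(LAYER_HEIGHT_MM * (layer + 1)))

print("\n".join(gcode[:4]) + "\n...")  # preview the first few commands
```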

In store, working alongside the robot was Daghan Cam of AI Build, who explained that, in contrast to the usual 3D-printing filaments made from non-recycled plastic (including PLA), the recycled filament is trickier to work with and has slightly different structural properties. And here lies the commonality between Bottletop’s sustainable hybrid ring-pull/crochet/leather materials and this new recycled filament: developing new materials is a long and complex process, requiring considerable R&D and bags (pardon the pun) of passion and perseverance. Oliver and Cameron have both in droves, and as they talk me through everything from the store’s 100% recycled rubber flooring and samples of the interior walls currently being printed at AI Build to the products themselves, their dedication to sustainable hand craft and cutting-edge technology, symbiotically, is inspiring. See how the product is made here.

It was a fitting choice to select a KUKA LBR robot to 3D print the bag charms in the shop window. Working harmoniously alongside humans is the exact purpose of the KUKA LBR: its inbuilt sensors stop it on contact, preventing injury, and the absence of trap hazards for human hands allows easy and safe collaboration. We undoubtedly have a growing dependence on technology and robots (although they are usually behind the scenes, carrying out repetitive manufacturing tasks unbeknownst to most consumers), so seeing the KUKA LBR used as a creative tool to produce 100% recycled (and recyclable) products was a lovely example of cutting-edge tech enabling sustainable manufacturing.

KUKA LBR with Daghan from AI Build

The store interiors will be installed over the coming weeks as a live installation, punctuated by last week’s official launch at the Regent Street store. Attended by Mira Duma of Future Tech Lab (FTL), Livia Firth of Eco-Age and Professors Sandy Black and Dilys Williams of the Centre for Sustainable Fashion at London College of Fashion, amongst other instrumental fashion and sustainability pioneers, the launch demonstrated how fusing fashion, technology and sustainability requires a combined commercial, creative and academic effort. It was an interesting and enlightening night, with Oliver and Cameron proudly declaring Bottletop the first sustainable luxury brand on Regent Street.

Image top: left – Oliver Wayman; right – Cameron Saul. Above: the Bottletop store launch party

Oliver and Cameron are excited about building the interior walls as a live installation that shoppers can watch evolve, and I went behind the scenes to see some of the 100 wall panels being 3D printed by the KUKA KR90 six-axis arms at AI Build in East London. Each panel takes seven hours to print and is individually sanded along the edges before being joined to create a unified wall panel for the store. 700 kg of 100% recycled plastic is going into the printing of the interiors – the equivalent, Oli confirmed, of around 60,000 recycled plastic bottles. I also saw a demo of the 3D-printed ceiling structure, which is embedded with reclaimed cans in the store and captured in the shots below.

Behind the scenes at AI Build

The interior installation in store is expected to continue into mid-January, so be sure to pop in and see it evolve, alongside the KUKA LBR busily 3D printing  bag charms in the store window.

Header image and all images not otherwise credited: Techstyler

Follow Techstyler on Twitter and Instagram

Wired Next Generation Provides Next Level Inspiration

In the spirit of turning things upside down, breaking the rules and doing things my own way I’m going to start at the end.  Wired Next Generation followed two days of Wired 2016, but I’m going to serve up some Next Generation gems first.  

My custom kicks by @moinart

Wired Next Generation is mental fertiliser, creative juice – food for thought – which brings me neatly on to the work and wisdom of Heston Blumenthal. A self-confessed irritant at school, always asking “why?”, Heston has been led by his curiosity and over-active senses on a culinary and scientific journey inspired by human evolution and imagination. Despite beginning his story with oysters, he cites Einstein, rather than Rick Stein, as an inspiration, and shares his infectious energy with a captivated audience.


Inspiration was never in short supply at Wired Next Gen, with heart-warming accounts of the triumph of education over adversity in a refugee camp in Jordan by Syrian campaigner Muzoon Almellehan, who is now a proud resident of Newcastle and happily getting to grips with the Geordie accent.  Read Muzoon’s blog posts explaining her work and experiences here.


Samantha Payne presented a ground-breaking social, medical and technological innovation in the form of bionic upper limbs. By using 3D scanning and printing, she and the team at Open Bionics have cut the cost of bionic upper limbs tenfold and produce them in four days rather than four months. For the world’s 2 million upper-limb amputees, this provides limbs that are not only functional but personalised and affordable. Samantha’s work turns kids (and adults) into bionic superheroes, questioning the future superiority of human anatomy over bionic alternatives. Empowering and moving, this transformative tech was demonstrated and explained by Tilly:


“What have you done today?” asked Oliver Franklin-Wallis, Assistant Editor of Wired, of the audience at Wired Next Gen after Google Science Fair award-winner Krtin Nithiyanandam explained how, at the age of 15, he combined two antibodies to devise a way to detect early-onset Alzheimer’s, drastically improving patient prognosis. Since then, he has devised a way to alter untreatable breast cancer cells into treatable ones. He is now 16. Oliver, I’m going to pass on that question…



Rounding off this article with a lyrical trip to the playground is Hussain Manawer. After getting sacked from Sainsbury’s (for eating a doughnut on the shop floor) he moved on to a job at Primark, followed by a stint at Coca-Cola. He studied Quantity Surveying at the University of Westminster, which was “pretty dry”, and went on to found his YouTube channel, Hussain’s House, seeking to support youth causes through artistic expression. He has since raised his voice at the One Young World Summit in Bangkok to speak for those who suffer from mental illness but struggle to be heard.

Hussain’s speech won him a Rising Star Award and a place on a space flight in 2018, when he will become the first British Muslim to go to space. His honesty, candour and wit are disarming and charming.  Here’s a clip of him taking us back to the playground.

To play us out, here’s Swedish popstar MY, who is unsurprisingly difficult to track down on a Google search.  Here she is at Ohheymy.

Thank you Wired, for the sparks and the seeds that will help the next generation of bright minds to blow ours.  Bring it!

Follow Techstyler on Instagram and Twitter

Fashion’s Robots: McQueen vs Plein

Just a day after I published my first blogpost on AI, robotics and fashion, a runway show came along featuring all three. Designer Philipp Plein showed his SS16 collection at Milan Fashion Week featuring robot band Compressorhead with Courtney Love on vocals and robot arms ‘styling’ the models with sunglasses and bags as they travelled along a conveyor belt. There were also drones flying overhead, whose function I am not sure of. Robots in fashion suddenly seem topical.

Header image by NYTimes. Image above by Wonderland Magazine

Dazed Digital reported that Philipp Plein’s shows are not normal shows (previous shows have included rappers on jet skis in customised Plein pools), but never had he taken a show this far in terms of concept and techno-grandeur. Each season he aims to outdo himself, apparently leaving the crowd wondering “what next?” I wasn’t at the show, but watched it in full here.

What interested me (beyond the impressive robot theatrics and investment in spectacle) was the reaction: a mixture of wonder, horror, anger and derision. I was lecturing today and played the show video for my fashion students; a handful replied in horror, “But they’re just copying McQueen!”, “McQueen did it better/first!” The rest were silent/amazed. Instagram is littered with similar comments. Well, McQueen did show industrial robots in his disturbingly beautiful S/S 1999 show, but with a very different theme. His concept was inspired by an installation by artist Rebecca Horn of two machine guns firing blood-red paint at each other.


Hear model Shalom Harlow speak about her experience being painted by the industrial robots in McQueen’s show.

Philipp Plein’s show was a rock-and-roll glamour extravaganza in which the robots were cool props, not apparently integral to the delivery of the clothing and the collection’s narrative. It looked like one hell of a show though, and hugely entertaining. Nothing wrong with that, surely? Ellie Pithers at the Telegraph said “If it was just entertainment – and the clutch of blondes jiggling along next to me in their Plein studded boots and slashed jersey dresses certainly enjoyed themselves – then it was spot on.”

I think the integration of technology, AI and robotics in the fashion product and its delivery is key if such a show is to convince. At the whiff of gimmickry, maybe the audience recoils a little? Authenticity is paramount in delivering a show that people connect with and truly buy into (on a deeper-than-commercial level). So, impressive as it is, maybe it’s not entirely convincing? Hashtag showoff?

Images: Wonderland Magazine

The only connection between the McQueen and Plein shows is robots. The use of robots as technical tools, and the motivation behind them, is entirely different. With McQueen’s use of technology, the story he was telling and the clothes themselves were still always the main event and the driving force. It was reported that the industrial robot arms that painted Shalom Harlow’s dress in show No. 13 took a week to program in order to ensure a choreographed connection with Shalom and the expression of McQueen’s vision. In contrast, at the Plein show, Compressorhead were doing what they already do (see them in action below) and the industrial robot arms, which are designed to move objects to and from a production line, appeared to be doing just that. There’s no apparent integration of the technology with the reason for the show in the first place – the clothes. The show reviews I have read of Plein’s SS16 collection barely mention the clothes. Indeed, the show began with Courtney Love and Compressorhead rocking out. It seemed to be a declaration of entertainment first, fashion second.

As a (cool, but slightly disturbing) aside, Compressorhead were created by Robocross machines, who have machines for all occasions. Check out Stickboy’s fascinating CV including date of birth (2007), specifics (four arms, two legs and one head) and playlist…



For those concluding the Plein show was a “ripoff” of McQueen’s: by that definition, any designer using the same machine/object/material as another designer would be copying too. It’s apparently fine to use the same materials/colours/silhouettes as other designers (indeed that helps to create a trend, so it’s quite useful for the industry), but less acceptable to use the same technology/theme to present a show. Most designers wouldn’t go anywhere near a show theme utilised by McQueen. Since McQueen used the Pepper’s Ghost technique to project a flutteringly angelic Kate Moss, who’d go there? I’m not suggesting designers shouldn’t go there; I’m merely illustrating the point that they generally don’t. Except Plein.

Fashionista quoted Plein as saying he feels he’s an industry outsider, that he doesn’t have support from the industry, and that he has a deep-rooted fear that people won’t turn up to his shows, as they didn’t in the beginning. He insists that as a self-funded, investor-less concern competing with the likes of Gucci and Chanel, his spectacular shows are par for the course, and that he is spending less than his rivals (who aren’t criticised for such excess). There’s a definite tinge of derision in the article as they go on to claim he “rants” about robots taking over. The Elle headline below is also derisory. Fashionista do, however, admit that Milan Fashion Week would be a lot less fun without his spectacular shows.


About the Plein show, following her comment that the entertainment factor was spot on, Ellie Pithers went on to say, “If it was meant to be a parody of the fashion industry – the conveyor belt demands of the schedule, the robotic nature of trends, the deliberate mechanics behind product placement – it was even better.” I’m not convinced there were metaphorical intentions, and this, her final statement, seems a fairly cold conclusion. The choice of Kraftwerk’s “Robots” and “The Model” further supports the literal nature of Plein’s message.

I think there’s a lesson to learn here for future conversations between those in technology and fashion seeking to fuse disciplines: integrate or irritate. The Apple Watch has been met with a tepid response for failing to create a genuine, desirable aesthetic fusion. “Just because you put a strap on it doesn’t make it a watch.” Watch out, Wearables.

Follow me:  Twitter @Thetechstyler  and  Instagram @techstyler

The Real and Imagined World of Robotics and Artificial Intelligence In Fashion

I am a hybrid. I began my career as a radiographer (x-raying and scanning patients) before studying fashion design and pattern-cutting. Upon graduation I worked simultaneously in hospitals and fashion studios, leading me to launch a fashion label, Brooke Roberts, in 2009, creating digital knitwear inspired by CT and MRI scans. I am now a designer, consultant, lecturer and blogger with a fascination for combining science, technology and fashion, both in a real and an imagined sense. Anything is possible.

ARTIFICIAL INTELLIGENCE AND ROBOTICS are generally met with a mixture of fascination and fear. They’re certainly hot topics – the Google car is the most recent high-profile example – telling us that AI and robotics are developing quickly and will soon become part of our everyday lives.

Twitter fed me an intriguing tweet about a robotics event called TAROS, hosted by The University of Liverpool Computer Science Department, headed by Professor Katie Atkinson. TAROS was open to the public and aimed at de-mystifying AI and robotics and sharing advances and potential uses for them. I couldn’t make it to TAROS, but Professor Atkinson and the team agreed to an interview on another day – and so begins Techstyler, a blog about science, technology and style.


My pre-reading led me to this summary of AI: it aims to have computers make decisions for us by analysing many factors (more than we could digest) in a quicker timeframe than we could analyse them, and coming up with an answer or series of solutions. In other words, it streamlines decision-making – so AI is not such a scary concept.

When it comes to robotics, it’s a common misconception that all robots are human-like, or at least that that’s the end goal of robotics.  In truth, many robots are built and programmed to do one very specific task and do not have any interactive or independent processing capability.  For example, they might pick up a pen from a conveyor belt and put it in a box.  They will repeat the exact same action over and over –  even if a human moves into their path, they will continue their programmed action.  Since killing people is NOT the aim of robots, this limitation is something that the team in the SmartLab at The University of Liverpool are working to solve, amongst many others.

Professor Atkinson (who introduced herself as Katie and is informal and very approachable) took me behind the scenes at the SmartLab to meet the team, headed up by Professor Karl Tuyls and including PhD students Bastian Boecker, Joscha-David Fossel and Daniel Claes. The SmartLab team are exploring how we can take robots away from the static, limited industrial type to the type that move around and interact with their environment: they can be programmed to work in groups, based on biological models – like the way bees and ants work together – and they can use laser and infrared technology to understand where they are and to detect objects and the presence of humans.

The guys in the SmartLab

The SmartLab team’s awards, including the RoboCup Championship trophy, 2014

I spent an afternoon chatting to the guys in the SmartLab about their work and messing about with their robots, leaving me inspired and full of wonder (and them further behind in their PhD work). They were really generous not just with their time but in sharing their knowledge. This is something that always humbles and inspires me about science – it exists to share knowledge and enlighten. By contrast, the creative industries, and fashion in particular, can be very secretive and exclusive, which may explain why, paradoxically, fashion is in many ways resistant to change and evolution. I talk about this in another upcoming blogpost with designer Sarah Angold, so stay tuned.

Daniel begins by telling me about RoboCup@Work and the robots they’re developing with the aim of helping humans in a factory, advancing away from the static, pre-programmed robot arms (which are really huge, heavy, limited to one action and housed in cages surrounded by lasers) to robots that can reason about and react to their environment. Total re-programming of static robots is required when production or a task changes within a factory, so this is an inconvenient and expensive limitation.

The robot prototype (featured in the video below and as yet unnamed – I ask why?!) is not static but on wheels, and is able to analyse its environment and carry out a given task in response to it. In this case, Daniel types in commands and the robot responds to a specific request to pick up a bolt and nut from a table and drop them into the correspondingly shaped holes at another table nearby. It uses a camera to scan its environment (it has a set of environmental objects it has been programmed to recognise), then collects, moves and drops the correct objects – there’s a rough code sketch of this sense-and-act loop after the snaps below. Here it is in action:

I took a couple of snaps of the pieces the robot picks up and the respective shapes the robot drops them into (Literal footnote: My boots are Dr Martens X Mark Wigan):

The items the robot chooses from

The respective shapes the robot drops the items into
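To make that sense-and-act loop concrete, here’s a toy sketch of the kind of logic Daniel describes – detect the objects the robot has been trained to recognise, then pick each requested item and place it at its matching hole. Every function and name here is hypothetical; the SmartLab’s real software is far more sophisticated.

```python
from dataclasses import dataclass

@dataclass
class DetectedObject:
    name: str   # e.g. "bolt" or "nut" - objects the robot recognises
    x: float    # position on the table, metres
    y: float

def detect_objects() -> list[DetectedObject]:
    """Stand-in for the camera pipeline: returns recognised objects."""
    return [DetectedObject("bolt", 0.42, 0.10),
            DetectedObject("nut", 0.38, 0.22)]

# Drop-off slots: the correspondingly shaped holes at the other table
TARGET_HOLES = {"bolt": (1.50, 0.30), "nut": (1.50, 0.45)}

def pick_and_place(task: list[str]) -> None:
    seen = {obj.name: obj for obj in detect_objects()}   # scan the scene
    for wanted in task:
        obj = seen.get(wanted)
        if obj is None:
            print(f"cannot see a {wanted} - a real robot would re-scan")
            continue
        tx, ty = TARGET_HOLES[wanted]
        print(f"pick {wanted} at ({obj.x}, {obj.y}) -> drop at ({tx}, {ty})")

pick_and_place(["bolt", "nut"])   # the kind of typed command Daniel uses
```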

My thoughts turn to robots in fashion. Perhaps the most beautiful and arresting use of robots in fashion so far has been Alexander McQueen’s haunting SS 1999 show:

The choreography of the robot arms excites my imagination, like snakes priming to attack. The industrial robots in the McQueen show are the type used to paint cars in a manufacturing plant. I start wondering how robots could impact the fashion industry once the development the SmartLab guys are doing becomes accessible, particularly since biological algorithms are being used to develop robots that could potentially interact as a ‘choreographed group’.

Bastian is working on swarm robotics, which means having a bunch of robots that control themselves using really simple (I laugh and remind Bastian that’s a relative term) algorithms based on attraction and repulsion – just the way birds form a flock as a group, rather than being directed from a centralised source. Swarms are really robust: you can take a couple of robots out of the system and the flock will still work. Bastian imagines his flock being used for planetary exploration with more localised connectivity, rather than for broadband-based military-type operations. Katie further explains that having many small, interlinked robots means that if one robot breaks down the others can continue exploring – the mission isn’t scuppered. Motion sensor-based communication between robots allows them to spread out and cover a large area – for example an earthquake disaster zone – and scan the terrain for movement or people.
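As a flavour of just how simple those attraction/repulsion rules can be, here’s a minimal swarm step in Python: each robot steers toward the group’s centre and away from close neighbours, with no central controller. The parameters are illustrative guesses, not the SmartLab’s values.

```python
import random

ATTRACTION = 0.01   # gentle pull toward the swarm centroid
REPULSION = 0.05    # push away from neighbours closer than MIN_DIST
MIN_DIST = 1.0      # metres, say

def step(positions: list[list[float]]) -> None:
    """One decentralised update: every robot applies the same two rules."""
    cx = sum(p[0] for p in positions) / len(positions)
    cy = sum(p[1] for p in positions) / len(positions)
    for p in positions:
        vx = ATTRACTION * (cx - p[0])          # attraction to the group
        vy = ATTRACTION * (cy - p[1])
        for q in positions:
            dx, dy = p[0] - q[0], p[1] - q[1]
            d2 = dx * dx + dy * dy
            if q is not p and 0 < d2 < MIN_DIST ** 2:
                vx += REPULSION * dx / d2      # repulsion from a close neighbour
                vy += REPULSION * dy / d2
        p[0] += vx
        p[1] += vy

swarm = [[random.uniform(0, 10), random.uniform(0, 10)] for _ in range(20)]
for _ in range(100):
    step(swarm)
```

Delete a few entries from `swarm` mid-run and the rest still cohere – the robustness Katie describes.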

The main source of inspiration for Bastian’s algorithms that “drive” the robots is bees and ants. Both use pheromone communication. Bees also use a dance for navigation: once they have found food they go back and forth to the beehive, demonstrating to the other bees the vector pathway to the food source. Bastian shows me a really cool video of how he and the SmartLab team created “bee robots” to perform this dance:

“Bee robots” in the SmartLab

Imagine the use of swarm algorithms to create choreographed robot runway models. The swarm algorithms would be supported by the development Joscha is doing on “state estimation” – the goal that a robot should know exactly where it is in relation to other things.

The runway robot models would know where they are in relation to each other and could use lasers to interact with the crowd, making ‘eye contact’ with viewers and giving them a wink or cocking a hip. I’d love to see that. I’m re-imagining the Thierry Mugler show in George Michael’s music video “Too Funky”, with robots as the models.

(Incidentally, this is the video that made me realise I loved fashion and wanted to work in the industry, back in 1992.)

Robot Naomi, anyone??

Naomi Campbell by Seb Janiak

Maybe I’ll start the first ever robot model agency? It may at first seem bizarre, but it would solve the problem of real people trying to achieve unreal proportions. I am not for a second saying the modelling profession is doomed or needs replacing (maybe just disrupting); I just think there’s merit in an alternative. If you think about the way the industry currently works, editorial images are often manipulated to extreme, non-human proportions (again, this can be creatively interesting, so I’m not dismissing it entirely), but the human aspect is significantly diminished.

Another biological algorithm Bastian is working on replicates ants’ passive technique of dropping pheromones to communicate with each other. Digital pheromones designed to mimic this communication could be used to divide labour between robots and solve complex problems. Robots can’t create biological pheromones, but there is development being done in this area. Imagine adding the ability to exude pheromones to runway robot models, and a whole new heady mix of interaction and attraction is possible. Here’s a demo of the digital pheromone communication concept:
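Alongside the demo, here’s a toy model of the digital pheromone idea in code, purely for illustration (it is not the SmartLab’s implementation): agents deposit a value on a shared grid, the value evaporates over time, and moves are biased toward stronger trails.

```python
import random

SIZE = 20
EVAPORATION = 0.95            # fraction of pheromone that survives each tick
grid = [[0.0] * SIZE for _ in range(SIZE)]

def deposit(x: int, y: int, amount: float = 1.0) -> None:
    grid[y][x] += amount       # an agent marks the cell it just worked on

def evaporate() -> None:
    for row in grid:
        for i in range(SIZE):
            row[i] *= EVAPORATION

def best_neighbour(x: int, y: int) -> tuple[int, int]:
    """Move toward the strongest nearby trail (ties broken randomly)."""
    options = [(nx, ny) for nx in (x - 1, x, x + 1) for ny in (y - 1, y, y + 1)
               if 0 <= nx < SIZE and 0 <= ny < SIZE and (nx, ny) != (x, y)]
    strongest = max(grid[ny][nx] for nx, ny in options)
    return random.choice([(nx, ny) for nx, ny in options
                          if grid[ny][nx] == strongest])

# One wandering agent: it lays a trail, follows it, and the world forgets
x, y = 10, 10
for _ in range(50):
    deposit(x, y)
    x, y = best_neighbour(x, y)
    evaporate()
```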

The discussion moves to flying robots, and Joscha explains that state estimation currently works well on ground robots, but with flying robots it gets more difficult because of the extra dimensions of tilt and pitch. The Xbox Kinect has a 3D camera that can be used on these types of robot prototypes. Laser scanners can also be used to take distance measurements across a plane: based on the time it takes for the laser light to reflect back off the surfaces of surrounding objects, the robot’s distance from those objects can be calculated. It makes me think of medical imaging and OCT (Optical Coherence Tomography) technology, where it’s possible to use a laser inside an artery to measure its dimensions – exactly the same principle, so I immediately get what Joscha is on about. Using the laser scanners, Joscha’s flying robots generate 3D maps:
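As a back-of-envelope illustration, that time-of-flight principle fits in a few lines: the beam travels out and back, so the distance is half the round trip at the speed of light.

```python
# Time-of-flight ranging: distance = (speed of light x round-trip time) / 2
C = 299_792_458.0                 # speed of light, m/s

def distance_m(round_trip_seconds: float) -> float:
    return C * round_trip_seconds / 2

# A reflection arriving 20 nanoseconds after firing puts the surface ~3 m away
print(distance_m(20e-9))          # ~2.998 metres
```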

I mention drones at this point and Katie explains that they avoid the term because of a general perception of military use, and instead say UAVs (unmanned aerial vehicles). It occurs to me again at this point that it must be a difficult, uphill battle to promote their advances in AI and robotics when such misconceptions exist. Joscha shows me his flying UAV, which uses silver reflective 3M tape as markers (which, coincidentally, I have used in yarn form for knitting) to understand its exact location. It is connected to a large geometric pitchfork-like object with silver balls on the end, which allows me to direct it through the air with sweeping arm movements. Here are the guys in action:

Joscha’s UAV with silver 3M balls

A jumper from my AW11 knitwear collection with 3M yarn and camel hair

The flying UAV makes me think of a video I saw recently of the brilliant Disney illustrator Glen Keane, who wore a VR headset and stepped into his illustrations, using sweeping arm movements with a paddle to draw in the virtual space around him. He immersed himself in his drawing and saw his illustrations in real-time 3D. The video of him ‘stepping into the page’ is well worth a look. So I guess I made an invisible air drawing with a robot.

I ask Katie whether there are barriers to spreading an accurate message about the work being done in AI, and she cites “science fiction stories of dystopian societies where the robots take over” as a big source of misconception. She is a passionate ambassador for AI, having worked on the research topic of AI and law for over 10 years. She has been working with a local law firm for the past year to expand their use of AI, enabling huge amounts of legal information to be analysed quickly to provide accurate and fast solutions. The whole impetus behind AI is problem solving. It’s about making our world a better and safer place, for example in disaster-relief situations and with self-driving cars. Katie shares some data on road traffic accidents: 97% of accidents are caused by human error. That’s a compelling statistic. Who doesn’t want to be safer on the road? For the record, Katie tells me that self-driving cars, like those being developed by Google and Bosch, are expected to be launched in stages, from partial self-driving initially to 100% self-drive in the final stage. So it won’t mean jumping into a vehicle and immediately relinquishing control.

Katie also shared with me some cool robots created by her Computer Science undergraduate students using Lego Mindstorms that demonstrate the concept of the self-driving car.


The ‘car’ has sensors on each side and at the front. The sensors detect the lines on the board (the road), and the car adjusts its direction and speed accordingly in order to stay within the black lines (stay on the road), speed up when it passes over the green lines and slow down or stop when it passes over the red. The cars are programmed with code written by the students and uploaded directly from a PC into the basic computer box that comes with the Lego Mindstorms kit. Pretty ingenious.
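For anyone wanting to try the idea, here’s a rough Python sketch of that line-follower logic. The real thing would be written against the Mindstorms kit’s own API; the sensor reads are stubbed with random values here, and the thresholds are invented.

```python
import random

def read_sensor(name: str) -> str:
    """Stand-in for a real light/colour sensor read."""
    return random.choice(["white", "black", "green", "red"])

speed, steer = 50, 0   # arbitrary units

for tick in range(10):
    left, right, front = (read_sensor(s) for s in ("left", "right", "front"))
    if left == "black":      # drifting over the left road edge: steer right
        steer = +10
    elif right == "black":   # drifting over the right edge: steer left
        steer = -10
    else:
        steer = 0
    if front == "green":     # green line: speed up
        speed = min(speed + 10, 100)
    elif front == "red":     # red line: slow down / stop
        speed = max(speed - 25, 0)
    print(f"tick {tick}: speed={speed} steer={steer}")
```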


I’m writing up this post while attending London Fashion Week, which has just relocated from Somerset House to the Brewer Street Car Park in Soho, on a pokey corner of a one-way street, causing a crammed carnival/circus atmosphere and serious logistical issues. I wonder if the Google car could solve this problem? Maybe editor-sized Google self-driving pods could locate their editor by GPS and navigate them quickly from one show to another while they focus on updating their Twitter and Instagram rather than worrying about traffic jams. It would be a kind of cross between the Google car and GM Holden’s Xiao EN-V electric pod, charging itself between shows. There’s no way those big old Mercedes cars that have long ferried LFW editors and buyers are going to get in and out of town fast enough with the new LFW location. The frustration outside the shows was palpable.

Google Car
GM Holden’s Xiao EN-V

So what can’t AI do?  I ask Katie about the use of AI and robotics in the creative industries.  Her response, “people regard this as the great separation between man and machine – the ability to be creative”, leaves me feeling relieved my job as a designer is safe.  Interestingly, a large part of Katie’s research work has involved categorising objective versus subjective information and determining how that information should be used in AI.

In terms of creative use of AI, examples exist in the gaming and film industries. In the Lord of the Rings movies, programmers used AI-based crowd algorithms to animate CGI characters in groups that move and interact together, rather than programming individual characters and then programming their interactions (that’s a lot of programming!).

So where are AI and robot technology headed? What’s next? Joscha pipes up and says he would like a robot at the supermarket that can collect everything he wants and bring it back to him. That sounds like a weird and wonderful space to be in – robots and humans shopping alongside each other. Katie then mentions the garment-folding robot – technology that strikes me as useful for garment manufacturing (and at home, obviously). The current incarnation folds one item quite well, but the SmartLab team are all about a robot being able to not only fold the item but then move across the warehouse and pack it. I personally would love a pattern-cutting robot working alongside me, tracing off and cutting out clean, final versions of patterns I have created and digitising them as it goes.

As I finish writing up this blog post, sipping a cup of coffee that (I’m ashamed to admit) I just reheated in the microwave, I wonder how people felt about microwave ovens when they were first introduced. Maybe there’s a similarity with AI, or any new industry-changing technology? People fear it because they don’t understand it, but once they see how useful it is, that fear subsides. Elementary, my dear Watson.

Follow me:  Twitter @Thetechstyler  and  Instagram @techstyler