Though tens of millions of dollars of investment has poured into the
space, fashion recognition apps have yet to fulfill their original
promise.
LONDON, United Kingdom — Fashion recognition apps
— often dubbed “Shazam for Fashion” — promised to make the entire world
shoppable. Anytime, anywhere, a user would be able to snap a piece of
inspiration — a sharply cut coat on a passerby, or a fetching mini-dress
in a magazine — and sophisticated “visual search” technology would
identify and retrieve a link to the item, available to instantly buy,
thereby radically shortening the path from inspiration to transaction.
Or so the pitch went. Over the last two years, investment has poured
into start-ups building fashion recognition apps. But are they working?
When this reporter used fashion recognition app Snap Fashion to
search for a white blouse, the first result was a taupe floral t-shirt. A
checked wool coat yielded a metallic jacquard blazer. On ASAP54,
another app using image recognition technology in the fashion space, a
search for a pair of tortoiseshell glasses pulled up red mirrored shades
and novelty pumpkin spectacles, while a grey cardigan returned a beaded
kimono. Other searches were more fruitful — a navy blue shirt yielded
a good selection of blouses and a striped t-shirt surfaced a frenzy of
Breton tops. But the victories were few and far between.
If there is a fundamental problem with fashion recognition apps, it’s
the current state of the underlying technology itself. “When we look at
something, there’s a bit of the brain that does the pre-processing.
There’s then different parts of the brain that look at colour, shape,
motion, texture, so it splits the image up at that point. We mimicked
that process,” explains Iain McCready, CEO of Cortexica, a company
founded in 2009 by Dr Anil Anthony Bharath and Jeffery Ng at Imperial
College London and a leading developer of image recognition technology.
“But there are techniques that we’re using going forward to make those
results even better, like machine learning, deep learning, neural
networks.” For the moment, however, image recognition technology is
simply not good enough to differentiate between fashion items, which
often have subtle but critical differences in cut and colour.
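McCready's description — an image split into separate colour, shape and texture channels, each matched against a catalogue — can be sketched in miniature. The snippet below is a hypothetical toy, not Cortexica's actual pipeline: it reduces each "image" (here just a list of RGB pixel tuples) to a normalised colour histogram and retrieves the catalogue item whose histogram lies closest, which is roughly how colour-based matching worked before the deep-learning techniques McCready mentions.

```python
from collections import Counter

def colour_histogram(pixels, bins=8):
    # pixels: list of (r, g, b) tuples. Bucket each channel into `bins`
    # ranges and normalise by total count, a crude stand-in for the
    # colour features a visual-search engine extracts before matching.
    width = 256 // bins
    counts = Counter()
    for r, g, b in pixels:
        counts[('r', r // width)] += 1
        counts[('g', g // width)] += 1
        counts[('b', b // width)] += 1
    total = 3 * len(pixels)
    return {k: v / total for k, v in counts.items()}

def distance(h1, h2):
    # Euclidean distance between two sparse histograms.
    keys = set(h1) | set(h2)
    return sum((h1.get(k, 0) - h2.get(k, 0)) ** 2 for k in keys) ** 0.5

def nearest_item(query, catalogue):
    # Return the index of the catalogue image closest to the query.
    qh = colour_histogram(query)
    return min(range(len(catalogue)),
               key=lambda i: distance(qh, colour_histogram(catalogue[i])))

# Toy "catalogue": flat swatches of solid colour.
navy = [(20, 30, 90)] * 100
red = [(200, 30, 40)] * 100
query = [(25, 35, 85)] * 100  # a slightly different shade of navy

print(nearest_item(query, [red, navy]))  # → 1 (the navy swatch)
```

A production system would replace the histogram with learned embeddings from a neural network — the machine-learning step McCready describes — precisely because raw colour features cannot capture the subtle differences in cut that trip these apps up.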
“If you look at image recognition to find exactly the same items, you
will be disappointed,” acknowledges Daniela Cecilio, CEO of ASAP54, a
fashion recognition app she founded in 2013, even though
ASAP54 employs teams of human stylists to enhance its automated search
results. In February, ASAP54 raised $3.8 million in venture funding from
e.ventures, Ceyuan Ventures, Novel TMT and others. The app pulls search
results from a database of 1.7 million products from over 200
retailers, has racked up about 450,000 downloads and takes a commission
on sales. The company declined to reveal current revenue.
“I always like to think of visual search as a discovery engine not a
search engine,” says Jenny Griffiths, the founder and CEO of Snap
Fashion, a fashion recognition app that launched in 2012. Griffiths, a
computer science graduate, developed the algorithms employed by Snap
Fashion in her final year at Bristol University. “I always try to get
away from that whole ‘Shazam for fashion’ comparison, as I think that
sets the bar way too high in terms of what the technology can do,” she
continues.
In coming release cycles, Snap Fashion plans to add small boutiques and
up-and-coming designers to its platform. “I like to think that in a way,
visual search is democratising fashion and opening up the platform for
new designers to be listed alongside the big high-street retailers,” she
adds. “So it gives designers a platform to grow, and it gives users a
platform to find new brands and mix things up a bit. I think it’s a
healthy shift in product discovery.”
Payment has also been a source of friction for users. Indeed, for
fashion recognition apps, which pull search results from a wide range of
retailers, the journey from picture to purchase is far from smooth.
Once they have found an item they want to buy, users are often redirected to retailer sites and left to contend with whatever payment processes those retailers have in place, rather than being able to complete a purchase within the app itself. Snap Fashion reports that only 2 percent of its users make it all the way down the sales funnel to complete a purchase. ASAP54 declined to share its conversion rate.
“It’s very difficult to launch a new app and generate installations
of it. It’s not easy,” says Marc Elfenbein, the CEO of Slyce, a visual
search firm that has raised a total of $28.7 million, to date, from
investors including Beacon Securities, Cormark Securities, Salman
Partners and Canaccord Genuity. According to Elfenbein, Slyce — which,
like ASAP54, also employs humans to enhance search results — has chosen
to white-label its technology to retailers rather than launch a standalone app because, as he puts it, “there are already huge audiences in place.”
What’s more, when the image recognition technology is deployed at the
level of a specific retailer, visual search results come from a limited
universe of products, making accurate results more likely. It’s not
quite as exciting as making the entire world shoppable, but it works.
In August this year, publicly traded e-tailer Zalando tested an image recognition app in Germany, enabling users to visually search its catalogue of 150,000 products, in a bid to better monetise mobile traffic, which, at the end of the second quarter, accounted for 41 percent of visits to the site.
According to Cortexica, which provided the technology for Zalando’s
app, the e-tailer reported an uptick in sales as a result of the
deployment and now plans to roll out the app across Europe. “Sometimes
customers don’t have a specific product in mind,” says Christoph Lütke
Schelhowe, vice president of customer experience at Zalando. “It’s a
great complementary approach that you can describe items not only with
words, but with images. An image speaks a thousand words. For customers
it’s a lot more convenient, which is why it’s so attractive.”
Last week, Neiman Marcus
launched ‘Snap. Find. Shop,’ a visual search feature on its mobile
app powered by Slyce. “You’re able to take a picture of a bag or shoe
and, then, with no further work from the user, you get the closest
comparable match from the Neiman Marcus catalogue. Then, you have the
ability to do a one-click checkout and purchase that item,” says
Elfenbein. The same technology can also help users locate similar items
in-store, or while browsing at the stores of competitors. “In the case
of Neiman Marcus, they specifically want it when their customers are in
their competitors’ stores,” he says.
The grand vision of better monetising desire by giving consumers
exactly what they want, at the very moment that they realise they want
it, remains out of reach, however. But what if fashion recognition
technology was integrated into large social platforms such as Instagram?
These sites are awash with fashion imagery, from street style images to
product shots. And with over 200 million users, a platform like
Instagram could train learning algorithms, which improve with each use,
at a scale that start-ups simply cannot match.
The potential is certainly there, but, for the moment, fashion
recognition technology remains a long way from fulfilling the popular
adage that a picture is worth a thousand words.