How Amazon knows what you're up to in its store
UEL academics assess impact of AI technology
Without thinking, a customer picks up a green apple at the grocery store. They have momentarily forgotten they prefer red apples. So, they replace the green apple and pick up a red apple.
A simple, everyday transaction. But fiendishly complicated for a computer to decipher and understand.
This, however, is an Amazon smart store. The online retail giant has deployed the latest technology to untangle that series of random actions into a coherent narrative of selection and purchase.
Later, mindlessly, the customer leaves without stopping to pay.
That’s fine. That’s how the Amazon store works. There are no cashiers. There are no tills.
Welcome to the future of retail. Welcome to the “just walk out” experience.
The new Amazon Fresh store in Ealing, London, and the latest in Wembley might look like typical shops, but behind the scenes – or up in the ceiling – is a whole world of inter-connected, cutting-edge technologies that place the shopper centre stage in a sophisticated machine learning enterprise aiming to predict and accommodate their tastes and intentions.
In the computer’s eye, the customer becomes an amalgamation of pixels whose movements must be tracked, monitored, predicted and captured.
The sheer scale of the enterprise has caught the eye of three computer science academics at the University of East London’s School of Architecture, Computing and Engineering.
Computer science lecturers Dr Mohammad Hossein Amirhosseini, Dr Nadeem Qazi and Dr Mustansar Ali Ghazanfar have been watching how advanced technology has been introduced into everyday life to create an Industry 4.0 experience.
Dr Amirhosseini said, “The most important difference between this store and other stores that are using self-service checkouts is that many cameras and AI-powered sensors track the items you take from the shelves and put in your basket, and charges are automatically applied via the Amazon Go app.”
“Machine learning and deep learning techniques, computer vision, and data analytics are used to recognise the object. In fact, it is similar to autonomous vehicles that recognise objects, such as pedestrians, on the road.”
“When you pick up your items your shopping list will be updated in real time, and when you finish you exit through a dedicated sensor-enabled barrier and the checkout process is completed automatically.”
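In outline, the basket-updating logic Dr Amirhosseini describes could be sketched as a running tally of pick and put-back events, settled automatically at the exit barrier. This is a minimal illustration, not Amazon's actual system: the `VirtualCart` class, its methods and the prices are all hypothetical.

```python
class VirtualCart:
    """Hypothetical sketch: a shopper's basket, updated in real time
    by pick/put-back events inferred from cameras and shelf sensors."""

    def __init__(self):
        self.items = {}  # item name -> quantity

    def pick(self, item):
        # Sensors report the shopper took an item from a shelf
        self.items[item] = self.items.get(item, 0) + 1

    def put_back(self, item):
        # Sensors report the item was returned to the shelf
        if self.items.get(item, 0) > 0:
            self.items[item] -= 1
            if self.items[item] == 0:
                del self.items[item]

    def checkout(self, prices):
        # Triggered when the shopper walks out through the exit barrier
        return sum(prices[item] * qty for item, qty in self.items.items())


cart = VirtualCart()
cart.pick("green apple")
cart.put_back("green apple")   # the shopper changes their mind
cart.pick("red apple")
total = cart.checkout({"red apple": 0.40, "green apple": 0.35})
print(total)  # 0.4 – only the red apple is charged
```

The apple swap from the opening of this article becomes two cancelling events plus one purchase; the shopper is charged only for what leaves the store.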
The concept of a till-less smart shop was first tested in a mock supermarket in a rented warehouse in Seattle in 2015 by Amazon thinkers. The first real store opened to the public in January 2018, selling prepared foods, groceries and drinks.
The challenges for Amazon in realising this seamless experience were immense. Shopping is commonplace. To a computer – or series of them – it is a whirl of often confusing pixels, making potentially erratic and irrational movements as shoppers act on choices and whims.
To turn that flow of information into a cohesive array of data points, and to be accurate virtually every single time, takes considerable processing power and thinking at the cutting edge of current AI technologies.
The store needs to be able to identify every individual in the store and track them (even when they might be half-hidden or in shadow), distinguish between different items on the shelves – including different flavours of the same drink for example – and detect a person’s activity near those shelves and understand if an item has been picked up and kept for purchase or returned.
Like an autonomous vehicle
So, Amazon deployed the latest thinking in deep learning, computer vision, data analytics and AI algorithms to convert a simple visit to the shops into a complex interaction of object and person recognition.
The system works like an autonomous vehicle that segments what it sees into pixels and builds a map in its own mind about what’s going on, making weighted decisions about what it “sees” – identifying objects, placing them in the store and attempting to predict movement and recording completed actions.
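One core sub-problem in that pipeline is keeping track of which blob of pixels is which shopper from one camera frame to the next. A common textbook approach (not a description of Amazon's proprietary method) is to match each new detection to the nearest existing track; the function below is a greedy nearest-neighbour sketch with made-up coordinates and a hypothetical distance threshold.

```python
import math

def track(prev_tracks, detections, max_dist=50.0):
    """Greedily associate new (x, y) centroid detections from the
    ceiling cameras with existing shopper track IDs; any leftover
    detection becomes a new shopper entering the store."""
    tracks = {}
    unmatched = list(detections)
    for tid, pos in prev_tracks.items():
        if not unmatched:
            break
        # Closest new detection to this shopper's last known position
        best = min(unmatched, key=lambda d: math.dist(pos, d))
        if math.dist(pos, best) <= max_dist:
            tracks[tid] = best
            unmatched.remove(best)
    next_id = max(prev_tracks, default=0) + 1
    for d in unmatched:
        tracks[next_id] = d  # new track for a newly arrived shopper
        next_id += 1
    return tracks


frame1 = track({}, [(10.0, 10.0), (200.0, 40.0)])   # two shoppers enter
frame2 = track(frame1, [(14.0, 12.0), (198.0, 45.0)])  # both move slightly
print(frame2)  # {1: (14.0, 12.0), 2: (198.0, 45.0)}
```

A production system must also survive occlusion, shadow and crowded scenes – exactly the "half-hidden" cases the article mentions – which is why the real store fuses many camera angles rather than relying on one.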
All these different feeds are brought together in a process called sensor fusion to form a single model of the shopper’s activity, attaching significance to each action and its relevance to purchases.
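The idea behind sensor fusion can be illustrated with a deliberately simple sketch: several cameras each offer a hypothesis about what just happened, weighted by how confident (or how well-placed) each camera is, and the system keeps the action with the highest combined score. The event strings and confidence values below are invented for illustration.

```python
from collections import defaultdict

def fuse(observations):
    """Toy sensor fusion: sum per-camera confidence for each
    hypothesised action and return the highest-scoring one."""
    scores = defaultdict(float)
    for action, confidence in observations:
        scores[action] += confidence
    return max(scores, key=scores.get)


# Three overhead cameras disagree about what the shopper just did:
event = fuse([
    ("picked up red apple", 0.80),  # clear, unobstructed view
    ("picked up red apple", 0.65),
    ("put back red apple", 0.30),   # partially occluded camera
])
print(event)  # picked up red apple
```

Real fusion systems combine far richer signals (shelf weight sensors, depth cameras, motion models) under probabilistic frameworks, but the principle is the same: no single feed is trusted on its own.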
The technology may have been developed and pioneered by Amazon, but artificial intelligence is coming to a high street near you. According to research group Gartner, some 77 per cent of retailers are putting AI in place this year to boost the customer experience, with some driven by Amazon’s ground-breaking innovations.
Dr Amirhosseini said the typical fear – of robots taking human jobs – need not play out.
He said, “The rapid adoption of ATMs from 1980 to 2000 increased the number of bank teller jobs throughout that period, because ATMs lowered the cost of opening new branches. This led to banks hiring more workers. But the job description changed. They began to handle more customer relations as their more routine tasks were being automated.”
“So, the process of automation may disrupt industries but ultimately it will create new and better jobs and grow the economy. Cognitive skills have become more important than physical activity, because of the AI revolution.”
The irony, therefore, is that Amazon – which has taken the lion’s share of retail business during lockdown and has been seen as the greatest threat to the viability of the high street – may be leading the march back to local shopping.
If you are interested in learning more about artificial intelligence, machine learning and the Internet of Things in preparation for a career in the Industry 4.0 economy, see our computer science course pages.
Main image of the first Amazon Go store in Seattle: By Sikander Iqbal, CC BY-SA 4.0, https://commons.wikimedia.org/w/index.php?curid=71909829