This is a contributed piece from Emil Eifrem, CEO of Neo Technology, the company behind the graph database Neo4j.
Amazon has taught us the value of being able to predict what else customers might want to buy, by analysing online sales data. It’s a lesson that any retailer wishing to survive needs to start learning – and applying.
But to do so, retailers not only need to know about a customer’s past purchases; they must be able to instantly combine that knowledge with any new interest shown during the current visit to offer recommendations.
How? Simple: they need to understand customer intent by analysing a host of clues the customer offers, interrogate this data at lightning speed to serve up uncannily relevant recommendations, and so generate great, tailored offers – offers that become increasingly accurate as the recommendation engine gathers more data and learns more about the customer in the process.
To accomplish this requires a combination of natural language processing (NLP), machine learning (ML), accurate predictive analytics, a distributed, real-time storage and processing engine and, I contend, a graph database to make all the real-time data connections required.
Why do I say that? Let’s look at a real-world example of just such a combination – eBay’s AI-based ShopBot is built on top of a graph database. That graph layer directly enables the system to answer sophisticated questions like, ‘I am looking for a brown, leather Coach messenger bag costing less than $100, please find me those’.
ShopBot asks qualifying questions and quickly serves up relevant product examples to choose from. The functionality is impressive – you can send the bot a photo with an instruction such as, ‘I like these sunglasses, can you find similar models?’ and it will employ visual image recognition and machine learning to find similar products for you, in milliseconds.
All this is done using NLP techniques to figure out your intent (text, picture and speech are parsed for meaning and context, as are spelling and grammatical cues), while the graph database (Neo4j) helps to refine the search against inventory with context – a way of representing connections based on shopper intent that’s shaping up to be key to the bot making sense of the world in order to help you.
That context is stored so that ShopBot can remember it for future interactions. When a shopper searches for ‘brown bags’, for example, it knows which details to ask about next, such as type, style, brand, budget or size. And as it accumulates this information by traversing the graph database, the application is able to quickly zero in on specific product recommendations.
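The narrowing process described above can be sketched in a few lines. The product data, attribute names and `refine` function below are all invented for illustration – a real deployment would express this as a Cypher query traversing relationships in Neo4j rather than filtering an in-memory dictionary – but the shape of the idea is the same: each piece of accumulated shopper context prunes the candidate set further.

```python
# Toy product "graph": each node carries attributes a shopper might specify.
# (Illustrative data only -- not eBay's catalogue or schema.)
PRODUCTS = {
    "bag-1": {"type": "messenger bag", "brand": "Coach", "colour": "brown",
              "material": "leather", "price": 95},
    "bag-2": {"type": "messenger bag", "brand": "Coach", "colour": "black",
              "material": "leather", "price": 120},
    "bag-3": {"type": "tote", "brand": "Fossil", "colour": "brown",
              "material": "leather", "price": 80},
}

def refine(candidates, **context):
    """Keep only products matching every context attribute gathered so far."""
    matches = []
    for pid in candidates:
        attrs = PRODUCTS[pid]
        attrs_ok = all(attrs.get(k) == v
                       for k, v in context.items() if k != "max_price")
        price_ok = attrs["price"] <= context.get("max_price", float("inf"))
        if attrs_ok and price_ok:
            matches.append(pid)
    return matches

# The shopper asks for a brown, leather Coach messenger bag under $100:
hits = refine(PRODUCTS, type="messenger bag", brand="Coach",
              colour="brown", material="leather", max_price=100)
print(hits)  # ['bag-1']
```

In a graph database the same question becomes a pattern match across product, brand and category nodes, so adding one more qualifying answer from the shopper simply tightens the pattern rather than requiring a new multi-table join.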
Why relational isn’t your best friend here
Tapping into human intent like this and delivering highly responsive, accurate help is the Holy Grail of what applied AI can offer. In this discussion on conversational commerce the example is well made: in response to a statement such as ‘My wife and I are going camping in Lake Tahoe next week, we need a tent’, most search engines would react to the word ‘tent’, and the additional context regarding location, temperature, tent size, scenery and so on is typically lost.
This matters, as it’s this specific information that actually informs many buying decisions – and which graphs help empower computers to learn. Context drives sophisticated shopping behaviour, and graph technology is the way to open it up for a retailer.
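To make the contrast concrete, here is a toy sketch of keeping the contextual slots a bare keyword search would drop. The slot names, vocabulary lists and `parse_intent` function are all invented for this illustration; real conversational systems use far richer NLP models than keyword matching.

```python
import re

# Tiny, hand-picked vocabularies -- purely illustrative.
PRODUCT_TERMS = {"tent", "sleeping bag"}
ACTIVITY_TERMS = {"camping", "hiking"}
LOCATION_TERMS = {"lake tahoe", "yosemite"}

def parse_intent(utterance):
    """Extract product plus the surrounding context a keyword search loses."""
    text = utterance.lower()
    slots = {"product": None, "activity": None,
             "location": None, "party_size": None}
    for term in PRODUCT_TERMS:
        if term in text:
            slots["product"] = term
    for term in ACTIVITY_TERMS:
        if term in text:
            slots["activity"] = term
    for term in LOCATION_TERMS:
        if term in text:
            slots["location"] = term
    # Crude party-size inference from pronouns/phrases.
    if re.search(r"\bmy wife and i\b|\bwe\b", text):
        slots["party_size"] = 2
    return slots

slots = parse_intent(
    "My wife and I are going camping in Lake Tahoe next week, we need a tent")
print(slots)
# {'product': 'tent', 'activity': 'camping',
#  'location': 'lake tahoe', 'party_size': 2}
```

A keyword engine acts only on the `product` slot; a graph-backed recommender can connect the activity, location and party size to inventory – a two-person tent suited to a Sierra Nevada summer – which is exactly the context-driven behaviour argued for above.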
But you can’t get there the way you’re going now. The traditional way of storing data is ‘store and retrieve’, but that doesn’t give you much in terms of context and connections – and for your searches and recommendations to be useful, context needs to come in.
To help improve meaning and precision, you need richer search, which is what AI-enriched applications such as chatbots give us.
Graph databases are now one of the central pillars of the future of applied AI – and the most practical way of getting there.