
For this interview we invited Sarah Downs, our Director of Client Solutions, to talk with Dr. Dorian Selz, the Co-Founder and CEO of Squirro. Prior to co-founding Squirro, Dorian founded the Swiss search platform local.ch, making it a market leader within four years. Prior to this he was a Partner and COO at Namics AG – the largest eBusiness consultancy in Switzerland and Germany.

Dorian holds a PhD from the University of St. Gallen and a Master’s in Economics from the University of Geneva. During the conversation Sarah and Dorian discussed his career evolution, Squirro’s embrace of Large Language Model (LLM) AI technology, and the emerging relationship with Synaptica.

SD: Your career path demonstrates you are very much a serial entrepreneur. Tell us about your career evolution to this point.

DS: I’m from a small farming village in the middle of nowhere, a remote part of Switzerland. I had the opportunity to study in Geneva, and it was there I became involved with a student organization, the European Students’ Forum. This opened up the world for me, including the chance to study for a PhD. Then I studied in Scotland, in Aberdeen, and completed my PhD at the Business School of the University of St. Gallen.

It was at this time, the mid-90s, that we founded our first company supporting web sites: Namics AG, an eBusiness consultancy with a strong presence in Germany. We provided classic professional services work – designing the first web pages of companies like UBS and Siemens – working through specific customer engagements or projects. It was one of those projects which led to my next company, local.ch. This grew to become Switzerland’s largest homegrown website and search engine. Swisscom had a majority stake in the company, and it was later sold.

My next startup was an online note-taking tool similar to Evernote. We tried to raise venture capital funding, which was challenging in Europe compared to the US. This type of product was popular and we were very competitive. But we learnt a hard lesson. No one – not Evernote, not us, not the other freemium proponents – ever achieved the level of paying users required to sustain the business model. You can’t survive as a restaurant if your patrons eat the food but don’t pay for it.

We took the opportunity to reinvent ourselves and that’s where Squirro comes in. We took what we had effectively done in the past and twisted it, applying it to Enterprise Search.


SD: I’m interested in the way you’ve described your next business coming out of something that’s happening in your current business. What was the moment of realization you had another potential business in Squirro?

DS: It’s about information. Making sure the appropriate information comes to you at the moment you need it, in the context you need it. That’s the vision that makes me get up in the morning. If you take the information coming to you when you need it, this implies a few things. You don’t want everything to come at you at once; that would be too much. The aim is for the appropriate things to come to you when needed. What you want is the appropriate information at the appropriate moment in the appropriate quantity.


SD: Your team at Squirro has grown to over 30 people. What do you look for when you’re hiring new team members? 

DS: I want to collaborate with people I can hang out with for a long time. I’ve been with my two co-founders for the past 25 years. It’s been the greatest privilege of my professional life. Some of the people we have built this company with have been together for an exceedingly long time.

There is a famous book Good to Great by Jim Collins. What he said is you need to get the right people on the bus, get the wrong people off the bus, and then decide which direction to take.

SD: Interesting. You start with the people and only then find the problem.

DS: The moment you’re in business pursuing an ambitious idea, it’s more like an expedition – you’re together for months, as if on the way to Mars. NASA selects the teammates that go together on a space mission. It’s important that they want to go, take part in training, are physically fit, and are technically brilliant. But that is not the defining criterion. What they look for is team composition. In space, if something goes wrong, can these people figure it out together as a team? Can they rely on each other? This doesn’t mean that you’re the best friends ever. But you need to work it out together.

SD: In 2023 Squirro embraced LLM AI technology, and rapidly developed solutions for Retrieval Augmented Generation (RAG). Can you share with us your overall vision for LLMs?

DS: Large Language Models (LLMs) have been around for a while, since before ChatGPT. This type of technology was created through open innovation introduced a few years ago, and eventually led to the creation of companies like OpenAI. These LLMs are thoroughly trained neural networks, which sounds quite complicated and fancy, but at the end of the day their function is quite simple – they probabilistically predict the next word.
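The "predict the next word" idea above can be illustrated with a toy sketch. This is not a real LLM – it is a simple bigram model over a tiny hand-made corpus – but it shows the same core mechanic: given the words so far, compute a probability distribution over candidate next words and sample from it.

```python
import random
from collections import Counter, defaultdict

# Tiny toy corpus; a real LLM trains on trillions of words.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count how often each word follows each other word (bigram counts).
counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

def next_word_distribution(word):
    """Probability of each candidate next word, given the previous word."""
    total = sum(counts[word].values())
    return {w: c / total for w, c in counts[word].items()}

def sample_next(word, rng=random):
    """Probabilistically pick the next word, like an LLM sampling a token."""
    dist = next_word_distribution(word)
    words, probs = zip(*dist.items())
    return rng.choices(words, weights=probs, k=1)[0]

print(next_word_distribution("the"))
# {'cat': 0.5, 'mat': 0.25, 'fish': 0.25}
```

An LLM does the same thing with a vastly richer model of context, which is also why hallucinations arise: the most probable continuation is not always a true one.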

Once you understand this, you understand where the strengths and weaknesses are. Their strengths are immediately recognizable: after extensive training, they can, out of the box, recognize structure in text. But out of the box, LLMs also have pretty big disadvantages. First, they sometimes compute a plausible but incorrect continuation – these are “hallucinations.”

The second disadvantage is a bit counterintuitive. Large Language Models are actually pretty bad at ingesting substantial amounts of data at high speed. Today it is very costly and takes time to train a Large Language Model. This is not something that works in a day-to-day, fast-moving business context. The volume of enterprise business demands would be an issue – a telecom provider has literally thousands of customer tickets a day. It’s impossible, almost useless, to try and manage this through LLMs today.

The third issue relates to the disregard for enterprise security. As a company, we immediately, and intuitively, understood that LLMs were going to be a game-changer. And that the LLM’s drawbacks are effectively the strong points of an existing enterprise search engine. But there is a lot of work to combine these two approaches and build something that you can deploy at the enterprise level.

That’s what we’ve done during the past 12 months. The reality is: AI was a minority sport a year ago. Now, no one can deny its impact and reach with the advent of these innovative technologies. The combination of techniques allows you to create business applications that can have massive economic impact.

SD: What do people get wrong about LLMs and AI right now?

DS: My analogy would be about the advent of social media platforms. When they launched everyone was expecting that Utopia had arrived – we are all going to be one big loving group worldwide because we have all these connections.

Turns out that these methods of connecting and communicating have also developed into platforms for hate speech, negativity, and toxic online behaviour. Are these social networks doing good for the world? In the last 20 years we have experienced civil wars, terrorist attacks, and online abuse. It’s a fair question to ask: are these inventions that went wrong? Either way, the technology’s use and impact turned out to be vastly different from our expectations.

In another analogy: I remember 25 years ago we did the first test with eCommerce and everyone told us: “No one is going to spend that much; no one is going to order stereo equipment online; there is a maximum to what people will be willing to spend over the internet.” The same with fresh food: “No one will buy vegetables online because of the logistics involved. It won’t work.” These predictions have been proven to be wrong.

When it comes to LLMs, I don’t think the chat application is going to be where this technology has the greatest impact. It’s the focus of a lot of time and energy right now, but it’s not where LLMs will end up in 5 years or 20 years.


SD: Where do you think the greater promise of LLMs lies?

DS: Let’s orient ourselves with a simple view of the enterprise value chain – a product is created and then sold. In this context, chat interaction on the sales side is just a tiny piece of the overall business function. In all the other parts of that process chain, especially for mid to large sized companies, you have massive amounts of activity. And within this you have many, many underperforming processes. The way companies operate today may be efficient in today’s world, but with the advent of LLMs you can improve these processes 5x or 10x.

I think LLMs are going to support a massive change in the way businesses are run. The real value will be the ability to optimize entire process chains in ways you couldn’t before. The euphoria around LLM-driven chat applications will settle down. I’m curious to see how many large organizations can navigate that maelstrom.

There will be a few people that will manage the transition. You will see the emergence of unexpected winners in that race. We’re also going to see transformative reconfiguration of entire value chains. You’re going to see reconfiguration of the way we buy books or organize our next trip away. The way we organize the production of whatever we produce.


SD: As you prepare yourself for this new world, how do you see Synaptica products enhancing Squirro solutions for your customers?

DS: What Dave Clarke and the team are delivering at the moment is a real breakthrough. Everyone thinks of a Knowledge Graph as a Knowledge Graph – it’s a good place to start. Hardly anyone has mastered the business integration element – knowledge graphs as a way to describe business processes – but the team at Synaptica do.

Let’s discuss an example: we spoke with a mobile telecom company. They frequently launch new product names for the same basic mobile plan. One is called Safe Plus, the next Safe Plus + One. Then we have Talk Cheap, followed by Talk Cheap Unlimited. And if you’re talking about technology back-ends, this multiplicity of naming becomes even more complicated: you have Cisco router 5EF-B, Cisco router 5EF-C etc.

This is where structured data is so powerful for organizing varied and complex information, and when you combine structured data with LLMs you get Retrieval Augmented Generation (RAG). Now you have a transformative stack available. You can immediately transform the entire customer support system. You solve the complexity of change – you can reorganize the whole business process in an efficient way.
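The combination described above can be sketched in a few lines. This is a minimal, illustrative RAG loop using the hypothetical plan names from the telecom example: retrieval here is naive keyword overlap (a real deployment would use a search engine, embeddings, or a knowledge graph), and the assembled prompt would then be sent to an LLM, which grounds the answer in retrieved facts rather than the model’s guesswork.

```python
# Hypothetical knowledge base, using the plan names from the telecom example.
KNOWLEDGE_BASE = [
    "Safe Plus is the basic mobile plan with 10 GB of data.",
    "Safe Plus + One adds a second SIM card to the Safe Plus plan.",
    "Talk Cheap offers discounted voice minutes on the same basic plan.",
]

def retrieve(question, k=2):
    """Rank knowledge-base entries by word overlap with the question.
    Stand-in for real retrieval (search engine, embeddings, knowledge graph)."""
    q_words = set(question.lower().split())
    scored = sorted(
        KNOWLEDGE_BASE,
        key=lambda doc: len(q_words & set(doc.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(question):
    """Assemble a grounded prompt: the LLM is told to answer only from
    the retrieved context, which limits hallucination."""
    context = "\n".join(retrieve(question))
    return (
        "Answer using only this context:\n"
        f"{context}\n\n"
        f"Question: {question}"
    )

prompt = build_prompt("How much data does the Safe Plus plan include?")
# `prompt` would now be passed to an LLM for the generation step.
print(prompt)
```

The design point is the separation of concerns: the retrieval layer (where structured data and taxonomies live) decides what the model is allowed to know, and the LLM only phrases the answer.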

The real thing that I want to achieve through our partnership with Synaptica is a transformed ability to respond to a service ticket or call. The individual responding to the service request can search through different systems, look up resources and find the solution. Let’s transform that role.

Look at air traffic control. You are dealing with multiple planes arriving at an airport, and they all require landing space. There is a danger that planes fly too close together. The Air Traffic controller can see who is arriving in sequence on a screen, and they can orchestrate the various flights. The systems then collaborate directly with the relevant stakeholders, ground staff, luggage transport. All the people involved in the landing process.

This same approach could work for our customer service person. They orchestrate the various inbound tickets or calls but their role becomes a conductor of business processes. It’s a fundamental change.

SD: I like that customer support example, because we see our customers building a lot of taxonomies around managing customer support, like classifying product and service issues. I think it’s an area where a lot of enterprises have focused on taxonomies. It’s also a place where I think people tried to immediately adopt the chat application with mixed results. There seems to be this obvious fit between LLMs and customer service provision. But you’re really imagining a much more mature, integrated approach: getting people the right information at the right time and place – using the power of LLM and the power of search and the power of taxonomies to get there – letting the machines do what they do well and empowering humans to do what they do better.

DS: Yes, you immediately transform the entire customer support system. You solve the complexity of change – you can reorganize the whole business process in an efficient way.

Synaptica Insights is our popular series of use cases sharing stories, news, and learning from our customers, partners, influencers, and colleagues. You can review the full list of Insight interviews online.

Author Vivs Long-Ferguson

Marketing Manager at Synaptica LLC. Joined in 2017, leads on marketing, social media and executive operations.
