TMS and Conversational AI: The End of the Click, the Beginning of Dialogue

For twenty years, TMS platforms have been measured by the number of widgets displayed on screen. More filters, more columns, more dashboards. The result: dispatchers spending their days clicking to find information they could simply ask for out loud. LLMs are now turning this logic on its head. Welcome to the era of the conversational TMS.

Twenty years of TMS UX: click fatigue

Open any TMS on the market: you’ll find a dense cockpit, inherited from 2000s-era ERPs. Cascading filters, nested tabs, infinitely configurable columns. The promise was appealing — see everything, control everything — but it backfired on its users.

On the ground, the findings are damning:

  • A dispatcher opens an average of 15 to 20 different screens per day
  • Newcomers take several weeks to master the interface
  • 80% of paid-for features are never used
  • Custom reports often require a support ticket… or a manual Excel export

The traditional graphical interface remains relevant for managing, dispatching and validating. But it becomes a bottleneck the moment you need to query data, cross-reference KPIs, or build a specific business tool.

From dashboard to dialogue: what LLMs really change

The arrival of large language models is reshuffling the deck. The question is no longer “how do I display the information?” but “how do I converse with my information system?”

Concretely, a dispatcher no longer needs to navigate five screens to find out how many deliveries are late in Lyon. They ask the question. They no longer need to build an Excel report for Carrefour: they request it. They no longer need to wait until Monday morning to take stock: their AI copilot pushes a brief at 8 a.m. every day, with alerts, the day’s tasks and the local weather.

“How many deliveries are late today in Lyon?” — “Generate the weekly report for Carrefour.” — “Send me a recap every morning at 8 a.m. with the key points to watch.”

These are no longer features to be coded. They’re sentences. And that’s exactly what a conversational layer built on top of the TMS enables.
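To make the idea concrete, here is a minimal sketch of how such a conversational layer could work: an LLM maps the user's sentence to a "tool" call against TMS data. The LLM step is stubbed out with keyword matching, and all names and data are invented for illustration; this is not Walter's actual implementation.

```python
# Toy stand-in for the TMS orders table (invented data).
ORDERS = [
    {"city": "Lyon", "status": "late"},
    {"city": "Lyon", "status": "delivered"},
    {"city": "Paris", "status": "late"},
]

def count_late_deliveries(city: str) -> int:
    """A 'tool' the conversational agent can call against TMS data."""
    return sum(1 for o in ORDERS if o["city"] == city and o["status"] == "late")

TOOLS = {"count_late_deliveries": count_late_deliveries}

def answer(question: str) -> str:
    # In a real agent, the LLM would emit {"tool": ..., "args": ...};
    # this stub hard-codes the mapping for one question shape.
    if "late" in question and "Lyon" in question:
        n = TOOLS["count_late_deliveries"]("Lyon")
        return f"{n} deliveries are late in Lyon today."
    return "Sorry, I can't answer that yet."
```

The point of the pattern: the interface is a sentence, but behind it sits ordinary, auditable code querying the same data the dashboards show.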


The AI copilot doesn’t replace the interface, it sits on top of it

Beware of a common misconception: conversation doesn't replace everything. A dispatcher reorganising a route by hand, an operator validating a POD with a swipe of the finger, a billing clerk editing a rate grid — they all need visual, fast and precise interfaces.

The right approach isn’t to throw dashboards out the window. It’s to add an AI layer on top of the traditional experience. The click and the dialogue coexist. Each does what it does best.

What the graphical interface still does better

  • Drag-and-drop an order onto a route
  • Visualise a heatmap of delivery density
  • Compare two schedules side by side
  • Quickly enter structured data

What conversation does significantly better

  • Query data in natural language
  • Generate an ad hoc report in seconds
  • Schedule a personalised daily recap
  • Create a mini business app without coding
  • Cross-reference information across multiple views
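The "personalised daily recap" above is the easiest of these to picture in code. A hedged sketch, with invented field names, of how a copilot might assemble the 8 a.m. brief before pushing it by email or chat:

```python
from datetime import date

def morning_brief(alerts, tasks, weather, today=None):
    """Assemble the daily recap a copilot could push to a dispatcher.

    `alerts`, `tasks` and `weather` stand in for whatever the TMS
    exposes; the real payload and delivery channel would differ.
    """
    today = today or date.today()
    lines = [f"Daily brief — {today.isoformat()}"]
    lines.append("Alerts: " + ("; ".join(alerts) if alerts else "none"))
    lines.append(f"Tasks: {len(tasks)} planned")
    lines.append(f"Weather: {weather}")
    return "\n".join(lines)
```

Scheduling it for 8 a.m. is then a plain cron job or workflow trigger; the conversational part is only that the user *asked* for it in a sentence instead of configuring it screen by screen.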

Walter, Everest’s AI dispatch agent

At Everest, this philosophy has a name: Walter. It’s the AI copilot natively integrated into the TMS, accessible from any screen. Walter isn’t a cobbled-together chatbot: it’s an agent connected to all of the system’s data — orders, routes, drivers, invoices, PODs, KPIs.

Ask it “who are my 5 most delayed customers this month?” and it answers. Ask it “generate a summary invoice for customer X” and it executes. Dictate a new urgent order by voice from your car, and it creates it.

Walter is the AI logistics operations assistant that turns every operator into an analyst, and every executive into a data-driven manager — without three weeks of training.

Walter Apps: vibe coding for the business

The next step is even more powerful. With Walter Apps, any user can create their own mini business app directly inside the TMS, simply by describing it in natural language.

Real-world examples from the field:

  • “Create a form for me to record WEEE returns with a photo and serial number”
  • “Build a screen that shows today’s POD anomalies, sorted by driver”
  • “Generate a time-tracking app for my freelance couriers”

Walter generates the app, integrates it into the interface, and connects it to the data. The CIO is no longer the bottleneck. The business takes back control. This is what’s called vibe coding: describe the intent, let the AI produce the tool.
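One plausible way to picture the output of such a request: the sentence "a form to record WEEE returns with a photo and serial number" gets compiled into a declarative form spec that the TMS renders and validates. The spec format and field names below are invented for illustration, not Walter Apps' actual schema.

```python
# Hypothetical form spec an LLM could generate from the request above.
WEEE_RETURN_FORM = {
    "title": "WEEE return",
    "fields": [
        {"name": "serial_number", "type": "text", "required": True},
        {"name": "photo", "type": "image", "required": True},
        {"name": "notes", "type": "text", "required": False},
    ],
}

def validate(record: dict, spec: dict) -> list:
    """Return the names of required fields missing from a submitted record."""
    return [
        f["name"]
        for f in spec["fields"]
        if f["required"] and not record.get(f["name"])
    ]
```

Because the generated artifact is data rather than hand-written code, the same engine can render it, validate submissions against it, and wire it to the TMS database — which is what lets the business side skip the IT queue.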


How Everest addresses these challenges

Everest has chosen a hybrid TMS: a robust traditional interface with the Walter conversational layer on top. The best of both worlds.

  • Walter, native AI copilot: ask questions, request reports, dictate orders by voice. Walter accesses all your data in real time and answers in natural language.
  • Walter Apps (built-in vibe coding): build custom business apps in a few sentences, without a line of code. Forms, tracking screens, specific workflows — everything is generated on demand.
  • Smart reporting and proactive notifications: schedule daily briefs, contextual alerts, and weekly recaps sent automatically by email or Slack.
  • Podchecker.ai for proof of delivery: automated verification of signatures, photos and parcel condition with 99% accuracy and up to 85% time saved on quality control.
  • Native n8n automation: 800+ connectable applications, no-code workflows, an average of 3 hours saved per day and 95% fewer data-entry errors.
  • Complete graphical interface: visual dispatch, multi-criteria route optimisation, interactive dashboards, Sherpas driver mobile app. Everything a modern TMS should offer, plus the AI layer.

A cultural shift as much as a technological one

Adopting a conversational TMS isn’t just about saving time. It’s about changing the relationship between humans and their tool. The dispatcher is no longer a software operator: they become a decision-maker, delegating mechanical tasks to an agent and focusing their energy on what creates value — decisions, exceptions, customers.

For leadership, the stakes are threefold:

  • Accelerated onboarding: a new dispatcher becomes productive in days, not weeks
  • Democratisation of data: you no longer need to be an analyst to query your KPIs
  • Business agility: every team builds its own tools without depending on IT

Key takeaways

  • Dashboards with 50 widgets are a thing of the past: conversation is emerging as the natural new interface for TMS platforms
  • Conversational AI doesn’t replace the graphical interface, it layers on top of it to handle what clicking does poorly
  • With Walter, any operator can query their data, generate reports and schedule alerts in natural language
  • Walter Apps lets every team build their own business apps without coding, simply by describing their need
  • The gain isn’t just productivity: it’s a new relationship between humans and their information system

Choosing a TMS in 2025 is no longer just about comparing features. It's about choosing an interaction philosophy. And in this shift, those who embed conversational AI at the heart of their operations will build a lead that competitors will take years to close.