What you will learn

  • Why many AI chatbots fail and what Veracross did to make theirs succeed
  • How Knowledge-Centered Service methodology makes AI smarter
  • The 18-month internal development and testing process before customer release
  • How the assistant intelligently routes unresolved issues into support cases
  • What real customers are saying about the quality of the answers

Three months in, our new VC AI Service Assistant is resolving 95% of customer questions on the first try, and customers are responding with a level of enthusiasm you just don’t encounter for a support chatbot. This didn’t come from moving fast and breaking things. It came from eighteen months of careful, thoughtful work.

The Honest Picture

We are very aware that schools, like many organizations, are being asked to embrace AI at a moment when public response to AI is, at best, mixed. Across classrooms, administrative offices, and IT departments, educators have had to reckon with AI tools that create new questions as fast as they answer old ones.

Veracross’s goal is to bring care and clarity to the work of answering these questions for the good of K-12 private and independent school communities. Our customers are busy people who care about getting things right for the sake of their students and families. They rightly have neither time nor patience for AI technology that doesn’t work reliably and well.

We knew, based on customer feedback, that the knowledge base in our Veracross Community was high quality, but that its sheer volume was overwhelming. This made it an excellent candidate for the current capabilities of AI.

A Knowledge Base That Lives Up to the Name

Most support chatbots are trained on generic content and pointed at customers with fingers crossed. Ours was built on something entirely different: a knowledge base that has been carefully cultivated for years by our subject matter experts, many of whom have worked in schools themselves or spent their careers supporting them.

Veracross Director of Knowledge and Instructional Design, Ian Drummond, explained it this way in a recent LinkedIn article: “AI amplifies the characteristics of the systems beneath it. In environments with disciplined knowledge practices, it extends the reach of accumulated expertise. In environments without them, it scales existing weaknesses.”

Ian is an expert in Knowledge-Centered Service, or KCS, the methodology that underpins how we build and maintain our knowledge base. Rather than creating documentation in isolation and hoping it stays current, KCS treats knowledge as a living resource — created and refined through the act of answering real questions, validated continuously by the people closest to the work. The Consortium for Service Innovation, which developed the KCS methodology, describes this approach as one where “knowledge becomes a dynamic asset that improves with use.”

The Veracross knowledge base doesn’t just contain general guidance for our SIS. It contains answers to the specific questions that school people ask about donor acknowledgment letters, non-scheduler methods, next action dates, admission workflows, and hundreds of other topics that only make sense if you truly understand how schools work. This institutional knowledge, accumulated over nearly 20 years, is what the AI is drawing on.

Ian and his teammates brought a level of architectural intentionality to this project that made the AI deployment possible. The knowledge base wasn’t simply handed to a language model; they worked with the developers to shape how the AI uses our knowledge base. Through carefully designed system instructions, the assistant is guided to draw information from multiple relevant articles, synthesize it into a single response, and cite its sources. This allows it to construct answers tailored to a customer’s specific question and context, even when the question does not use the same terminology.

This is how VC AI Service Assistant can meet people where they are.
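For readers curious about the mechanics, the retrieve-and-synthesize pattern described above can be sketched in miniature. Everything below — the article data, the keyword-overlap ranking, the function names — is an illustrative stand-in, not Veracross’s actual implementation:

```python
# A minimal sketch of the retrieve-synthesize-cite pattern.
# Article IDs and the keyword-overlap ranking are hypothetical stand-ins
# for a real semantic search layer.

ARTICLES = [
    {"id": "KB-101", "title": "Donor Acknowledgment Letters",
     "text": "How to generate and send donor acknowledgment letters."},
    {"id": "KB-205", "title": "Non-Scheduler Methods",
     "text": "Building student schedules without the scheduler tool."},
    {"id": "KB-318", "title": "Next Action Dates",
     "text": "Using next action dates to track admission follow-ups."},
]

def retrieve(question, articles, top_k=2):
    """Rank articles by word overlap with the question (a stand-in for
    semantic search) and return the best matches."""
    q_words = set(question.lower().split())
    scored = [
        (len(q_words & set((a["title"] + " " + a["text"]).lower().split())), a)
        for a in articles
    ]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [a for score, a in scored[:top_k] if score > 0]

def answer(question):
    """Synthesize one response from several articles and cite sources."""
    hits = retrieve(question, ARTICLES)
    if not hits:
        return "I couldn't find an answer; let's open a support case."
    body = " ".join(a["text"] for a in hits)
    cites = ", ".join(a["id"] for a in hits)
    return f"{body} (Sources: {cites})"

print(answer("how do I send donor letters"))
```

The key design point the sketch illustrates: when nothing relevant is found, the assistant declines to fabricate an answer and pivots toward a support case instead.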

I LOVE the new chatbot. I have been using Veracross for almost seven years now. The biggest challenge I face when searching through support articles is using the right vocabulary. The chatbot eliminates that friction — I just chat in plain English and it comes back with the right Veracross article.

— International School of Brooklyn

This natural-language breakthrough shows up again and again in our customers’ feedback. You don’t have to know whether you’re asking about a “scheduler” or a “non-scheduler method.” You just describe what you’re trying to do, and the assistant figures out the rest.

Taking Time to Get It Right

Another reason that VC AI Service Assistant works is that, before releasing it, we tested it rigorously with a group of people who know a lot about how schools use software: Veracross employees.

Work on the underlying AI assistant began more than 18 months before our January 2026 launch. The early use case was internal. We launched HR and IT AI assistants built on large language models and grounded in approved internal documentation. In both cases, they were designed to help Veracross employees get quick, well-documented answers to common workplace questions. This gave us a testing ground far more realistic than any demo environment.

Less than a year after deployment, these internal assistants are answering around 150 questions per month, providing answers sourced from approved internal documents and freeing Veracross’s HR and IT staff to focus on higher-value work. They gave us proof of concept for our hypothesis that a carefully constructed knowledge base, paired with a well-tuned AI layer, could reliably serve people at scale. The difference, Ian notes, lies in the internal setup: “the more reliable signal comes from people who are already motivated to get the right answer, using a tool embedded in work they’re actually doing, with no particular reason to overlook failure. It is harder to engineer, but it tells you something the demo environment can’t.”

From there, the team expanded toward developing an AI-powered product support assistant, which is a more complex reasoning challenge because of the vast size of the knowledge base. Once the support assistant was ready, we deployed it internally and invited support staff and other employees to test it with a wide range of questions, giving our subject matter experts the opportunity to test continuously.

The plan was that we would investigate the failures and then work to solve the underlying issues. Except — there weren’t very many failures. During the internal employee product test, the assistant answered 3,090 questions with a 98.1% success rate. That was when we knew we were on to something.

When the customer pilot launched in December 2025, approximately 25 schools put the product to the test, with Slack enabling tight feedback loops that gave us confidence in a strong success rate.

I’ll be honest — this is one of the best chatbots I’ve used. Rather than offering me a bunch of articles that only sort of match what I’m asking, the bot was able to articulate back at me exactly what I was asking for and gave me multiple routes to go.

— Bernard Zell Anshe Emet Day School

Having met our success rate goal, we rolled out VC AI Service Assistant to our global Veracross customer base on January 8th.

A Support Partner, Not an Answer Machine

There’s a design philosophy behind the VC AI Service Assistant that distinguishes it from the frustrating chatbot experiences most of us have encountered on commercial websites. Too often, bots are optimized for deflection: they keep you away from a human agent, usually by offering something that looks like an answer whether or not it actually is one. We designed the VC AI Service Assistant around a different philosophy: to help and solve, not just respond.

Rather than pattern-matching a query to a single article, it searches across multiple knowledge base entries simultaneously. From there, it synthesizes the relevant information before presenting it in plain language with step-by-step guidance and links to the underlying documentation.

For many of my questions it offered to create a step-by-step process for completing the task I asked about. It also found three to four articles and put them into context for me.

— Covenant Christian Academy

Further, if you type in something simple, such as a two-word topic search like “donor letters,” it doesn’t just hand you any old result: it asks clarifying follow-up questions.

A screenshot of a query about donor letters

And when our VC AI Service Assistant can’t find an appropriate answer, it moves seamlessly to open a detailed support case. The tool attaches a transcript of the conversation, gives it a clear title, and applies the correct tags so the case routes directly to the right team, eliminating the guesswork that used to add time on both sides of the exchange.
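In code terms, the escalation step amounts to packaging the conversation into a structured, routable object. The sketch below is a hypothetical illustration only — the routing table, tag names, and function signature are assumptions, not Veracross’s API:

```python
# A hedged sketch of automatic support-case creation. The routing table,
# tags, and field names are illustrative assumptions.

def open_support_case(transcript, topic):
    """Package an unresolved conversation into a routable support case
    with a clear title, team routing, tags, and the full transcript."""
    TEAM_TAGS = {  # hypothetical topic -> (team, tags) routing table
        "donor": ("development", ["giving", "letters"]),
        "schedule": ("academics", ["scheduling"]),
    }
    team, tags = TEAM_TAGS.get(topic, ("general-support", []))
    return {
        "title": f"Assistant escalation: {topic}",
        "team": team,
        "tags": tags,
        "transcript": transcript,  # full conversation attached for context
    }

case = open_support_case(
    ["User: My donor letters won't merge.",
     "Assistant: I couldn't resolve this."],
    "donor",
)
print(case["team"], case["tags"])
```

The payoff of this structure is on the receiving end: a case that arrives pre-titled, pre-tagged, and with its history attached can be routed without the back-and-forth that normally precedes triage.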

A screenshot of a service ticket created by the VC AI Service Assistant

We deployed this same intelligence to help with the direct-to-case path. When a customer needs to start with a support case instead of a knowledge search, they still get the benefit of the assistant, because it operates as a smart interface for the support form itself. It can ask questions that help our customers articulate the request, customize the title and summary fields for clarity, and include a transcript of the conversation. The result is that the support team receives better, more actionable requests and can resolve complex cases more efficiently.

I especially love how it submitted a ticket on my behalf when it couldn’t answer my questions fully.

— American Heritage School

The Good News in Numbers

As of the end of March 2026, our assistant has answered 95% of questions successfully, across over 8,000 conversations from over 1,800 individuals at over 660 schools, worldwide.

According to Peak Support’s 2024 Customer Service KPI benchmarking study, the average AI chatbot resolution rate is 35%, with only the top-performing implementations reaching the category of “Best” at 95%. Our VC AI Service Assistant isn’t just above average. It’s competing at the top of the field.

I often have little questions that aren’t worth a ticket but are too complicated for Google. The AI chatbot being able to sum up the articles in the knowledge base and just tell me what to do and where to do it has been a HUGE timesaver.

— Trinity Academy South Bend

Even with these good results, we had one last step: we needed to make sure that interacting with an AI assistant was a positive experience for customers. To measure this, our Director of Customer Advocacy, Cody Larkin, designed a customer survey. The key to understanding whether something has real value for customers, he told us, is to use very practical language when asking for feedback. So he focused on the most salient question: would you use this again?

We surveyed early adopters over a four-week period and received 274 responses, with an average score of 9.1 out of 10 and overwhelmingly positive comments. The verdict: the VC AI Service Assistant is making work easier, and our customers are eager to keep using it.

The VC AI Service Assistant is deployed now in the Veracross Community, available to logged-in customers who can submit cases. If you’re a current customer and haven’t tried it yet, we’d love to hear what you think!