
The world of legal tech has seen many shifts in recent years, and the most prominent today is the shift towards AI. New tools aimed specifically at law firms seem to appear each day. At the same time, the pressure to choose something, anything, to avoid falling behind builds and builds. It is worth stepping back to recognize that not every tool is the same, and not every tool is right for your firm.
As builders of AI tools, we have seen the struggles our clients face in finding and adopting the right tools for their workflows. We learned from their experience and aim to share these insights. This article offers a framework of questions and criteria to help you sift through the barrage of solutions and identify the one that can best address your firm’s problems.
The recent wave of legal AI tools is being driven by two forces. First, it is now much easier to build and launch software using AI coding tools, which means new products are hitting the market faster than ever. Second, the volume of information about these tools on social media and in search results has exploded, making it difficult to tell which tools actually work and which are merely well-marketed.
The issue is that many of these tools are built quickly without the underlying engineering needed to support real legal workflows. They may look polished in a demo, but fall short when it comes to reliability, integrations, and day-to-day use inside a firm. That is why it is important to evaluate whether a tool is solving a specific problem in your practice or simply repackaging existing AI with a legal label. The next section breaks down how to spot the difference.
Vendors aim to schedule a demo call so they can walk you through their product on their terms. These calls are helpful for understanding a product's capabilities, but it is important to join them with your own questions in mind. The right questions surface how a tool actually performs in practice.
Asking these types of questions may extend an introductory call or lengthen the otherwise “quick” demo conversation, but the cost of missing a red flag far outweighs the time spent pressing for clarity. If a provider controls the conversation with excessive technical jargon, that is also instructive. A demo should feel like a practical discussion about your workflow and your cases. If the company claims to focus on law firms, they should be able to explain their product in the language you use every day.
Additionally, providers that rely on recorded videos of their product at work, or on tightly defined test cases, are often not prepared for the nuances and edge cases that arise in real matters. A startup may demonstrate a clean scenario that works on a whiteboard, but your firm operates in the real world, a far more variable environment.
Vendors who focus solely on making employees more productive, operating around the clock, or generating predefined outputs are missing the point. Productivity and efficiency gains are the most obvious, and frankly the easiest, benefits for an AI tool to deliver.
A polished demo is not the same as testing a tool within the realities of your firm’s operations. Most AI products look impressive in a controlled environment. The real question is how they perform when placed inside your workflow, with your data, and under the time pressures your team deals with every day. For that reason, it’s best to approach demos as controlled experiments. Here are some examples:
If you are evaluating a drafting tool, provide an actual intake from a recent matter and ask it to produce the corresponding document. If the tool claims to summarize medical records, upload a representative file, and evaluate the output against what your paralegal would produce. Real inputs reveal real limitations.
Define what success looks like in measurable terms. Without predefined criteria, it becomes easy to be impressed without being convinced. Decide in advance which processes you are aiming to improve, and set concrete expectations: how much time the tool should realistically save, what accuracy threshold is acceptable, and how intuitive the interface must be for your team to actually use the product.
Involve the people who will actually use the tool. Paralegals, junior attorneys, and other staff members often see issues that decision makers miss. If they find the tool confusing or disruptive, adoption will stall regardless of how strong the technology appears in a leadership meeting.
Do not rely on a pre-recorded example or a tightly managed walkthrough. Ask questions in a way a real prospective client would. Interrupt it. Change direction mid-conversation. The way the system handles unpredictability will tell you far more than a scripted exchange ever could.
It is tempting to frame every AI discussion around cost reduction. For small and mid-sized firms in particular, expense control is a constant priority. Yet if cost savings are the primary lens, you risk undervaluing what AI can actually do and overpaying for a simple tool.
Any credible AI system will process information faster than a human performing the same repetitive task. The more important question is whether the tool improves outcomes that matter to your firm’s strategy.
Does it improve case preparation quality, which in turn supports stronger settlement positions? Does it expand firm capacity by allowing attorneys to focus on high-value analysis rather than administrative drafting? If it’s an intake tool, does it actually increase client conversion rates by ensuring every intake call is answered and properly qualified? These are strategic gains, not incremental efficiencies that any off-the-shelf tool can provide.
Consider defining concrete success metrics before implementation. These could include reduced turnaround time for pleadings, higher billable productivity per attorney, improved client satisfaction scores, or an increased percentage of qualified leads that become signed cases. When an AI tool is evaluated against these broader metrics, the conversation shifts from checking boxes to improving the measurable performance of your firm.
Elevating your evaluation criteria forces vendors to demonstrate alignment with your growth objectives rather than simply showcasing a flashy new AI feature.
More tools will come, and many will sound compelling. That will not change. What can change is how your firm evaluates them. Technology should follow strategy. If a tool is not tied to a specific business outcome, it is not worth your time.
Before booking another demo, define what you are trying to improve. Be concrete: intake conversion, drafting speed without sacrificing quality, attorney capacity. Identify the metric that proves it is working. With that clarity, vendor conversations become straightforward. You stop reacting to features and start evaluating fit. Some tools will meet that standard; most will not. That discipline is what separates firms that benefit from AI from those that waste time or fall behind.

Jacob Adberstein
Co-Founder and CEO | Reflekt Legal
Jacob Adberstein is the Co-Founder and CEO of Reflekt Legal, where he builds AI intake employees for law firms. His work focuses on helping firms improve client response, increase conversion rates, and streamline intake through tailored AI systems.
He provides AI intake solutions for personal injury, family law, criminal defense, immigration, and employment law firms that manage intake from first contact through qualification. Through this work, Jacob has developed a practical understanding of how AI performs across a wide range of legal workflows and operational environments.
Jacob is an active voice in the legal tech space, speaking at events including The Legal Tech Fund Summit and hosting legal tech programming during SXSW. He partners directly with firms to implement systems that improve performance and scalability.
Prior to founding Reflekt Legal, Jacob worked as an engineer in NASA’s robotics division within the Dynamic Systems Test Branch (ER5), where he developed systems designed to protect astronauts in high-risk environments. That experience continues to shape his approach to AI in legal settings, where reliability and precision are non-negotiable.
His broader vision is to enable stronger human relationships through AI by offloading repetitive, time-intensive work, allowing legal teams to focus on the client-facing and relationship-driven aspects of their practice.



