
Podcast: Responsible AI in Law: Building Frameworks, Use Cases, and Trust with RAILS

Everyone is talking about AI, but we should spend just as much time discussing its responsible use. Fringe Legal sat down with some of the individuals involved with RAILS (Responsible AI in Legal Services).

RAILS is a thoughtful, collaborative effort to ensure AI is used ethically and effectively across the legal sector. Here are some of the key takeaways from our conversation.

Thanks to our guests, Kelli Raker, Eli Makus, and Leigh Zeiser.



AI Literacy Is the New Legal Competency

One message was loud and clear: if you're a legal professional, AI literacy is not optional. You don't need to become an expert, but you do need to understand how AI tools work, what risks they introduce, and how to discuss them with clients.

 "AI literacy is a great endeavor." - Leigh Zeiser

It's about understanding what generative AI can do and what it can't. This is critical for setting the right client expectations and making smart decisions about how and when to use these tools.

Whether you’re in a large law firm, a solo practice, or legal aid, investing in AI literacy is part of being a competent professional today. Start small. Create space for conversations in your team. Share resources. Ask questions out loud.

Don't Build Alone

A recurring theme from the guests was that too many people are trying to figure this out in isolation. That’s not sustainable. Innovation moves faster when it’s shared.

RAILS provides a model worth considering. Their working groups unite people from different corners of the legal ecosystem: large firms, legal aid, technologists, and academics. They build together and share what they learn.

You don’t have to wait for a formal invite (they are looking for volunteers). You can create your own trusted group. Maybe it's across departments. Perhaps it’s a few firms in your region or your building. Having a place to talk openly about what’s working and what’s not matters.

Risk Is Part of the Deal

The guests didn’t shy away from the risks of AI. They leaned into them. But they also reminded us that there’s a risk in not using AI at all. If you’re not experimenting now, you may fall behind in ways that matter to your clients.

"We decided to just start building our own tool internally, mostly so we understood it from the bottom up. So we knew where our data was going to be. We understood how the language models would function, how we could modify prompts and back-end work so that we get a better result." - Eli Makus

Eli shared that his team decided to build their own AI tool internally as a learning exercise. They wanted to understand how the technology worked, where their data was going, and how to control the results.

This is a smart move. While not everyone can or should build their own tools, everyone can choose to understand how their tools function. The key is intentionality.

Equity Needs to Be in the Room from the Start

One of the most powerful points came from Kelli Raker, who noted that the risk isn't just misuse; it's also unequal access. Generative AI has the power to democratize access to legal services.

Legal aid organizations and smaller firms often don't have the resources to experiment with AI. But their clients face just as many, if not more, challenges. If AI is going to reshape legal services, we need to make sure it's not only improving outcomes for those with the most access.

That means sharing tools, publishing frameworks, and embracing open-source thinking. RAILS is doing that, but more organizations can and should follow suit.

Use Cases Are Learning Tools

RAILS has been developing a library of use cases. Each one is designed to spark thinking. It outlines a legal task, how AI could support it, what benefits might emerge, and what risks should be considered.

The real value is in the conversation they provoke. Start collecting your internal use cases, even simple ones. Talk about them as a team. Look at where they go right, where they fall short, and what you would need to feel comfortable using them in practice.

Responsibility Is a Team Sport

If one big idea underpinned the whole discussion, it was this: responsibility in legal AI doesn't happen on its own. It happens when people build frameworks, share lessons, and approach innovation together.

The guests didn’t claim to have all the answers, but they were asking the right questions and doing it in public.

That's something more of us can do. Whether you're at a major firm or just starting out, you have a role to play. You can help set the tone for how AI gets used. You can make sure responsibility stays part of the conversation.


Guest Bios

Eli Makus is the Managing Partner of Van Dermyden Makus Law Corporation, a preeminent investigations firm with attorneys across California and Arizona. Van Dermyden Makus is devoted to promoting fair workplaces and safe campuses through industry-leading neutral fact-finding services. Under Eli's leadership, the firm is committed to leading the industry by innovating in technology and the delivery of legal services. Drawing on his extensive employment law background, Eli conducts complex and sensitive investigations involving a variety of workplace complaints for public and private employers throughout California. He regularly trains on employment law topics, including training internal and external investigators on how to conduct impartial workplace investigations. Eli is the Immediate Past President of the Association of Workplace Investigators (AWI) Board of Directors and regularly serves as faculty for AWI's ANAB (formerly ANSI)-accredited Training Institute.


Through the Duke Center on Law & Tech, Kelli Raker manages programs and initiatives including the RAILS (Responsible AI in Legal Services) network and design thinking sessions for law students. She has served as managing director of the Duke Law Tech Lab, a justice tech accelerator, and has published a report on her justice tech research. Kelli was a 2021 Women of Legal Tech honoree.


In her role as Sr. Manager of Legal Tech Consulting with the Am Law 100 firm BakerHostetler, Leigh Zeiser champions process innovation to help the firm and its clients create a competitive advantage. Leigh works with a myriad of legal technologies but has deep expertise in AI-enabled solutions, including generative AI; automation solutions such as robotic process automation and low/no-code platforms; and legal operations workflow redesign. She leads a team of tech enthusiasts delivering services to operationalize legal guidance across the practice spectrum. Leigh engages in strategic analyses of people, process, and technology to build trust, manage change, and drive adoption within each innovative initiative.
