Can AI Be Trusted to Write Construction Contracts? Lawyers Say the Risks Are Still Too High

Despite tech improvements, legal experts warn that AI-generated contracts may leave builders exposed to costly gaps and errors.

San Jose, California, 25 November 2025 – Two and a half years ago, Construction Dive asked ChatGPT to create a construction contract for a 600-unit mixed-use project in San Jose using a traditional design-bid-build model. At first glance, the document looked convincing. It included key components such as scope of work, payment structures, insurance language, change-order procedures, and termination terms. Some attorneys even said parts of it could be enforceable.

But the contract also left out crucial protections and failed to properly manage project risk. Legal experts at the time warned that relying on artificial intelligence to generate a full construction contract was risky and could lead to serious legal and financial consequences. They compared the idea to opening Pandora's box: unpredictable and dangerous.

Today, attorneys say AI has become far more common and is now part of everyday workflows across the industry. But the key question remains: after two and a half years of rapid improvement, is AI finally capable of writing a dependable construction contract? And should construction lawyers use it for that purpose? Many say absolutely not.

Megan Shapiro, a construction attorney and partner at Radoslovich Shapiro in Sacramento, said she would be alarmed to learn that anyone was using AI to produce a full contract. She emphasized that construction agreements require careful drafting, project-specific details, and legal precision that AI cannot yet deliver reliably.

Other legal professionals agree. Michael Vardaro, managing partner at New York-based Zetlin & De Chiara, said contract writing is not a task where someone can simply press a button and receive a finalized document. He explained that contract language must be negotiated, reviewed, and aligned with risk, insurance obligations, performance standards, and state laws.

Accuracy concerns remain one of the biggest challenges. Shapiro said she is often surprised by how incorrect AI-generated legal content can be and how frequently errors appear. These mistakes are commonly known as hallucinations, instances in which AI produces information that sounds believable but is inaccurate. Reports have shown that some newer systems still exhibit high hallucination rates depending on how they are tested.

Construction lawyers say AI can still be useful for certain tasks, such as summarizing documents, reviewing clauses, drafting outlines, researching case references, and speeding up administrative work. However, they warn that treating AI as a full contract generator could expose construction firms to disputes, insurance conflicts, compliance issues, and unenforceable terms.

For now, experts agree that human oversight remains essential. With construction projects involving complex coordination, liability exposure, scheduling pressures, and financial risk, attorneys say AI should support, not replace, professional contract drafting.
