
Researchers have developed DBAL-YOLO, a novel deep learning-based framework that automatically converts non-digital engineering drawings into 3D Building Information Models (BIM). Achieving 98.8% detection precision and complete geometric reconstruction without manual intervention, this technology resolves a challenging issue in creating digital twins for existing buildings.

Digital twin city systems serve as the foundation for intelligent urban management and resilience monitoring. These systems rely heavily on digital models of physical entities, particularly buildings. However, a significant challenge persists: many existing and older buildings lack digital 3D models, with their information stored only in scanned or archived paper drawings.

Transforming these non-digital drawings into Building Information Models (BIM) traditionally requires labour-intensive manual modelling, which is time-consuming and prone to human error. Addressing this critical gap, a research team has developed a fully automated, high-precision reconstruction framework named DBAL-YOLO.

“Efficient 3D modelling technologies for existing buildings are critical for digital twin systems,” explain the researchers. “Our goal was to create a solution that could ‘read’ complex engineering drawings and build a 3D model automatically, without relying on traditional modelling software.”

The core of this innovation is the DBAL-YOLO algorithm, an improvement upon the YOLOv11n architecture. Standard AI models often struggle to detect slender structural components like beams and columns amidst the dense linework of engineering drawings. To overcome this, the team integrated Dynamic Snake Convolution (DSConv) to enhance sensitivity to elongated structures and a Bidirectional Feature Pyramid Network (BiFPN) for efficient multi-scale feature fusion.
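The BiFPN component fuses feature maps from different scales using a fast normalized weighted average rather than a plain sum. The sketch below illustrates that fusion step only; the weights, feature shapes, and function name are illustrative assumptions, not the paper's actual configuration.

```python
import numpy as np

def fast_normalized_fusion(features, weights, eps=1e-4):
    """BiFPN-style fast normalized fusion: a weighted average of
    same-shape feature maps with non-negative (learnable) weights."""
    w = np.maximum(weights, 0.0)      # clamp weights to be non-negative
    w = w / (w.sum() + eps)           # normalize so weights sum to ~1
    return sum(wi * f for wi, f in zip(w, features))

# Two toy same-shape feature maps (e.g. a top-down path and a lateral skip)
f1 = np.ones((8, 8))
f2 = np.full((8, 8), 3.0)
fused = fast_normalized_fusion([f1, f2], np.array([1.0, 1.0]))
# Equal weights give roughly the mean of the two maps (~2.0 everywhere)
```

In a trained network the weights are parameters learned per fusion node, which lets the model emphasise whichever scale carries the clearest evidence of a thin beam or column.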

By incorporating an attention mechanism and a lightweight asymmetric detection head, the researchers significantly reduced the model’s computational load while improving its ability to identify key structural elements.

Tests conducted on a dataset of 3,960 annotated drawings demonstrated that DBAL-YOLO achieves a high precision of 98.8% and a recall of 98.3%.
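Precision and recall summarise detection quality from the counts of true positives, false positives, and missed components. A minimal illustration follows; the counts are invented for the example and are not the paper's data.

```python
def precision_recall(tp, fp, fn):
    """Precision: fraction of predicted boxes that are correct.
    Recall: fraction of ground-truth boxes that were found."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return precision, recall

# Hypothetical tally: 988 correct detections, 12 spurious boxes,
# 17 components missed entirely
p, r = precision_recall(988, 12, 17)
# p = 0.988, r ≈ 0.983
```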

Beyond detection, the framework introduces a unique module-based correction method. It uses Optical Character Recognition (OCR) to read dimensional text and aligns the detected components with standard architectural modules, automatically correcting geometric errors caused by scanning noise.
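A correction of this kind can be sketched as snapping OCR-read dimensions to the nearest multiple of a standard module. The 100 mm module size, the tolerance, and the function name below are assumptions for illustration, not values from the paper.

```python
def snap_to_module(dim_mm, module=100, tol=30):
    """Snap a measured dimension (mm) to the nearest multiple of a
    standard architectural module when it falls within tolerance,
    correcting small geometric errors from scanning noise."""
    nearest = round(dim_mm / module) * module
    return nearest if abs(dim_mm - nearest) <= tol else dim_mm

# A beam span read as 5978 mm from noisy pixels snaps to 6000 mm;
# a 5950 mm reading is outside the 30 mm tolerance and is kept as-is
corrected = snap_to_module(5978)   # 6000
kept = snap_to_module(5950)        # 5950
```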

The final step involves a custom Python-based reconstruction engine. It combines the structural data with wall information extracted via a U-Net segmentation model to generate a complete 3D building model.
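Generating 3D solids directly from 2D data essentially means extruding each detected footprint along the height axis. A minimal sketch, using an invented rectangular column footprint and storey height rather than anything from the paper:

```python
def extrude_footprint(polygon_2d, height):
    """Extrude a 2D footprint (list of (x, y) vertices) into a 3D
    prism: bottom face at z = 0, top face at z = height."""
    bottom = [(x, y, 0.0) for x, y in polygon_2d]
    top = [(x, y, float(height)) for x, y in polygon_2d]
    return bottom + top

# A 400 x 400 mm column footprint extruded to a 3000 mm storey height
column = [(0, 0), (400, 0), (400, 400), (0, 400)]
solid = extrude_footprint(column, 3000)
# 8 vertices: 4 on the bottom face, 4 on the top
```

A full engine would additionally build faces between the two rings of vertices and merge adjacent solids, but the extrusion step above is the core of turning 2D pixel geometry into volumetric BIM elements.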

“This framework operates independently of commercial BIM software,” note the researchers. “It directly generates 3D solids from 2D pixel data, streamlining the workflow for urban renewal and facility management.”

The technology holds immense potential for the rapid digitisation of urban infrastructure, allowing city managers to easily integrate old building stock into modern digital twin platforms for better maintenance and safety monitoring.

This paper, “Automated BIM modelling from non-digital engineering drawings using improved YOLOv11n-based detection framework”, was published in Smart Construction (ISSN: 2960-2033), a peer-reviewed open-access journal covering all areas of intelligent construction, operation, and maintenance, from fundamental research to engineering applications. The journal publishes original research articles, communications, reviews, perspectives, reports, and commentaries. It is indexed in Scopus, and article submission is free of charge until 2026.

Cai X, Deng J, Zhu Y, Zhao B, Mao X. Automated BIM modelling from non-digital engineering drawings using improved YOLOv11n-based detection framework. Smart Constr. 2026(1):0003, https://doi.org/10.55092/sc20260003.

---

Author(s): Jenny He

This article is republished from: EurekAlert!, 24.02.2026
