
Enhancing Building Semantics Preservation in AI Model Training with Large Language Model Encodings

Source: arXiv
Original Author: Suhyung Jang et al.


A new study reveals that using large language model (LLM) embeddings improves AI training for building semantics in the architecture, engineering, construction, and operation (AECO) industry. Testing on 42 building object subtypes, the approach outperformed traditional one-hot encoding, with the llama-3 compacted embedding achieving a weighted average F1-score of 0.8766. This method enhances AI's ability to interpret complex semantics, indicating significant potential for broader application in AECO tasks.

Advancements in AI Model Training for Building Semantics

Recent research highlights a breakthrough in the architecture, engineering, construction, and operation (AECO) industry, focusing on enhancing building semantics representation in AI model training. By utilizing large language model (LLM) embeddings, the study reveals significant improvements in AI systems' ability to comprehend relationships among building object subtypes.
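The intuition behind this improvement is that one-hot vectors treat every subtype as equally unrelated to every other, while dense LLM embeddings can place semantically related subtypes closer together in feature space. A minimal sketch of that contrast (the subtype names and embedding values below are illustrative toys, not data from the study):

```python
import numpy as np

def cosine(a, b):
    """Cosine similarity between two feature vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# One-hot encoding: every subtype is orthogonal to every other, so
# "Door" is no closer to "Window" than to "Beam".
one_hot = {
    "Door":   np.array([1.0, 0.0, 0.0]),
    "Window": np.array([0.0, 1.0, 0.0]),
    "Beam":   np.array([0.0, 0.0, 1.0]),
}
print(cosine(one_hot["Door"], one_hot["Window"]))  # 0.0
print(cosine(one_hot["Door"], one_hot["Beam"]))    # 0.0

# Dense embeddings (toy values standing in for LLM embeddings of the
# subtype names) can encode that doors and windows are more alike.
embed = {
    "Door":   np.array([0.9, 0.8, 0.1]),
    "Window": np.array([0.8, 0.9, 0.2]),
    "Beam":   np.array([0.1, 0.2, 0.9]),
}
assert cosine(embed["Door"], embed["Window"]) > cosine(embed["Door"], embed["Beam"])
```

A downstream classifier trained on such features can exploit that geometry, which is the mechanism the study credits for the F1 gains.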

Methodology and Findings

The research involved training GraphSAGE models to classify 42 building object subtypes within five high-rise building information models (BIMs). Results indicated that LLM encodings significantly outperformed the conventional one-hot encoding baseline, with the llama-3 compacted embedding achieving a weighted average F1-score of 0.8766, surpassing the 0.8475 score for one-hot encoding.
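GraphSAGE classifies a node by repeatedly combining its own features with an aggregate of its neighbors' features. A single mean-aggregation layer can be sketched in NumPy as follows; the graph, dimensions, and random weights are illustrative assumptions, not the study's trained model or its BIM data:

```python
import numpy as np

rng = np.random.default_rng(0)

def sage_mean_layer(H, adj, W_self, W_neigh):
    """One GraphSAGE layer with mean aggregation:
    h_v' = ReLU(W_self @ h_v + W_neigh @ mean(h_u for u in N(v)))."""
    deg = adj.sum(axis=1, keepdims=True)
    deg[deg == 0] = 1.0                      # guard isolated nodes against divide-by-zero
    neigh_mean = (adj @ H) / deg             # mean of each node's neighbor features
    return np.maximum(0.0, H @ W_self.T + neigh_mean @ W_neigh.T)

# Toy graph of 4 building objects; each 8-dim feature row stands in for
# a compacted LLM embedding of the object's subtype label.
n, d_in, d_out = 4, 8, 16
H = rng.normal(size=(n, d_in))
adj = np.array([[0, 1, 1, 0],
                [1, 0, 0, 1],
                [1, 0, 0, 1],
                [0, 1, 1, 0]], dtype=float)  # undirected adjacency matrix
W_self = rng.normal(size=(d_out, d_in))
W_neigh = rng.normal(size=(d_out, d_in))

H1 = sage_mean_layer(H, adj, W_self, W_neigh)
print(H1.shape)  # (4, 16)
```

In the reported setup, the input features would be either one-hot subtype vectors or LLM embeddings, and the final layer's outputs would feed a softmax over the 42 subtype classes.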

Implications for the AECO Industry

The findings underscore the potential of LLM-based encodings to enhance AI's capacity to interpret complex, domain-specific building semantics.

Related Topics:

building semantics, AI model training, large language model, GraphSAGE models, embedding dimensions

📰 Original Source: https://arxiv.org/abs/2602.15791v1

All rights and credit belong to the original publisher.
