Large Multi-Modal Model Cartographic Map Comprehension for Textual Locality Georeferencing

Date
2025-08-15
Publisher
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Rights
© The author(s)
CC BY
Abstract
Millions of biological sample records collected over the last few centuries and archived in natural history collections remain un-georeferenced. Georeferencing the complex locality descriptions associated with these collection samples is a highly labour-intensive task that collection agencies struggle with. None of the existing automated methods exploits maps, which are an essential tool for georeferencing complex relations. We present preliminary experiments and results of a novel method that exploits the multimodal capabilities of recent Large Multi-Modal Models (LMMs). This method enables the model to visually contextualize the spatial relations it reads in the locality description. We use a grid-based approach to adapt these auto-regressive models for this task in a zero-shot setting. Our experiments, conducted on a small manually annotated dataset, show impressive results for our approach (∼1 km average distance error) compared with uni-modal georeferencing using Large Language Models and existing georeferencing tools. The paper also discusses the findings of the experiments in light of an LMM's ability to comprehend fine-grained maps. Motivated by these results, a practical framework is proposed to integrate this method into a georeferencing workflow.
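The sketch below is one possible reading of the grid-based, zero-shot idea described in the abstract, not the authors' implementation: overlay a labelled grid on a map tile, prompt an LMM with the locality description and the gridded image, and convert the cell it names back to coordinates. The function `query_lmm`, the A1-style cell labels, and the 10×10 grid size are all illustrative assumptions.

```python
# Illustrative sketch of grid-based zero-shot georeferencing with an LMM.
# `query_lmm` is a hypothetical placeholder for a multimodal API call that
# accepts an image plus a text prompt and returns a text answer.

from PIL import Image, ImageDraw


def overlay_grid(map_image: Image.Image, rows: int = 10, cols: int = 10) -> Image.Image:
    """Draw a labelled rows x cols grid over a map tile so the LMM can name a cell."""
    img = map_image.copy()
    draw = ImageDraw.Draw(img)
    w, h = img.size
    for r in range(rows + 1):
        draw.line([(0, r * h / rows), (w, r * h / rows)], fill="red")
    for c in range(cols + 1):
        draw.line([(c * w / cols, 0), (c * w / cols, h)], fill="red")
    for r in range(rows):
        for c in range(cols):
            # Label each cell, e.g. "A1" in the top-left corner of the cell.
            draw.text((c * w / cols + 2, r * h / rows + 2), f"{chr(65 + r)}{c + 1}", fill="red")
    return img


def cell_to_latlon(cell: str, bbox, rows: int = 10, cols: int = 10):
    """Convert a grid label such as 'C7' to the centre coordinate of that cell.

    bbox = (min_lon, min_lat, max_lon, max_lat) of the map tile.
    """
    r = ord(cell[0].upper()) - 65
    c = int(cell[1:]) - 1
    min_lon, min_lat, max_lon, max_lat = bbox
    lon = min_lon + (c + 0.5) * (max_lon - min_lon) / cols
    lat = max_lat - (r + 0.5) * (max_lat - min_lat) / rows  # row A is the top of the image
    return lat, lon


def georeference(locality: str, map_image: Image.Image, bbox, query_lmm):
    """Zero-shot prompt: ask the LMM to place the locality text on the gridded map."""
    gridded = overlay_grid(map_image)
    prompt = (
        "The attached map has a labelled grid (rows A-J, columns 1-10). "
        f"Which grid cell best matches this locality description?\n\n{locality}\n\n"
        "Answer with the cell label only, e.g. 'D4'."
    )
    cell = query_lmm(image=gridded, prompt=prompt).strip()
    return cell_to_latlon(cell, bbox)
```

Under these assumptions, the ∼1 km average distance error reported in the abstract would correspond to the geodesic distance between the centre of the predicted cell and the ground-truth point, averaged over the annotated test localities.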
Keywords
Georeferencing, Large Language Models, Large Multi-Modal Models, LLM, Natural History collections
Citation
Wijegunarathna K, Stock K, Jones CB. (2025). Large Multi-Modal Model Cartographic Map Comprehension for Textual Locality Georeferencing. In: Sila-Nowicka K, Moore A, O’Sullivan D, Adams B, Gahegan M (eds), Leibniz International Proceedings in Informatics (LIPIcs). Schloss Dagstuhl – Leibniz-Zentrum für Informatik.