2025

Lunar-Bench: Evaluating Task-Oriented Reasoning of LLMs in Lunar Exploration Scenarios

Xin-Yu Xiao, Ye Tian, Yalei Liu, Xiangyu Liu, Tianyang Lu, Erwei Yin, Qianchen Xia, Shanguang Chen

Submitted to NeurIPS 2025 (under review)

This paper introduces Lunar-Bench, the first benchmark designed to rigorously evaluate the reasoning and decision-making capabilities of large language models (LLMs) under the unique constraints of lunar exploration. Featuring 3,000 high-fidelity tasks across critical lunar operational domains, Lunar-Bench goes beyond accuracy metrics by proposing Environmental Scenario Indicators (ESI), which assess models' safety, efficiency, factual integrity, and alignment. Evaluations of 36 state-of-the-art LLMs reveal significant performance gaps compared to human experts, underscoring the urgent need for robust, domain-adapted solutions in mission-critical AI deployment.

Lunar Twins: We Choose to Go to the Moon with Large Language Models

Xin-Yu Xiao, Yalei Liu, Xiangyu Liu, Zengrui Li, Erwei Yin, Qianchen Xia

Annual Meeting of the Association for Computational Linguistics (ACL) 2025 (accepted)

This paper presents Lunar Twins, the first large language models specifically designed for lunar exploration. The system includes the Chang’e and Yutu models, introduces a collaborative multi-agent workflow (Lunar_GenData), and establishes the first specialized lunar dataset integrating data from the Chang’e missions. Extensive experiments show that Lunar Twins significantly outperform comparable models in domain expertise and hint at embodied intelligence potential.

Integrating Wavelet Transforms into Image Reconstruction Networks for Effective Style Transfer

Yunfei Chu, Xin-Yu Xiao, Longchen Han, Yaoshun Yue, Maohai Lin

Journal of Imaging Science and Technology, 2025 (published)

This paper presents an effective method for image style transfer that integrates wavelet transforms into the whitening and coloring processes of image reconstruction networks. The proposed Wavelet Transfer Network (WTN) directly aligns the feature covariance of content and style images, yielding high-quality stylized outputs with improved efficiency and generalization. Experimental results demonstrate the superiority of WTN over existing methods in both arbitrary and photorealistic style transfer.
