Volume 26, Issue 9, 2025
  • Regular Papers

    Reporting on the latest advancements in shape-changing interfaces, this paper systematically analyzes plant-derived natural shape-changing phenomena and summarizes plant-inspired design strategies for shape-changing interfaces. The authors explore plant-inspired shape-changing interfaces and offer solutions to practical application problems across agriculture, healthcare, architecture, and robotics.

    Junzhe JI, Chuang CHEN, Boyu FENG, Ye TAO, Guanyun WANG

    Vol. 26, Issue 9, Pages: 1509-1533(2025) DOI: 10.1631/FITEE.2500118
    Abstract: Shape-changing interfaces use physical changes of shape as input or output to convey information and interact with users. Plants are natural shape-changing interfaces, adept at adjusting their shape or modality to adapt to the environment. In this paper, plant-derived natural shape-changing phenomena are systematically analyzed. Then, several corresponding plant-inspired design strategies for shape-changing interfaces are summarized, together with recent advancements in material selection and synthesis, fabrication methods, and actuating mechanisms. Practical applications across diverse domains demonstrate the advantages and potential of plant-inspired shape-changing interfaces in agriculture, healthcare, architecture, robotics, etc. Furthermore, opportunities and challenges are discussed, such as design thinking in interdisciplinary tasks, dynamic behavior and control principles, novel materials and processes, matching application scenarios to functionality, and large-scale application requirements. This paper is expected to inspire in-depth research on plant-inspired shape-changing interfaces.
    Keywords: Shape-changing interfaces; Tangible interfaces; Botanical bionics; Human–computer interaction; Smart materials
    Updated:2025-10-13
  • Regular Papers

    In the field of electronic design automation (EDA), this paper reviews research progress in automatic schematic generation (ASG). The authors comprehensively survey existing ASG work, give an in-depth description of its core algorithms (layout and routing), and analyze in detail the current challenges facing the technology's application in PCB reverse engineering. Feasible solutions are discussed, laying a foundation for automatic PCB schematic generation technology.

    Jie YANG, Kai QIAO, Jian CHEN, Chen CHEN, Lixiang GUO, Bin YAN

    Vol. 26, Issue 9, Pages: 1534-1550(2025) DOI: 10.1631/FITEE.2400612
    Abstract: The printed circuit board (PCB) stands as the cornerstone of electronic equipment, with its schematic holding paramount importance for system performance and reliability. In light of the pervasive use of electronic devices in society, concerns regarding maintenance, safety, backdoors, and other latent issues have garnered significant attention. Automatic schematic generation (ASG), with its distinct capability for generating circuit schematics autonomously, not only plays a pivotal role in electronic design automation (EDA) but also aids in deciphering the fundamental principles of PCB equipment to effectively address these underlying issues. However, constrained by the increasingly sophisticated manufacturing processes of PCBs and the inherent legal and ethical controversies surrounding reverse engineering, the development of related technologies faces notable bottlenecks. To break through these technical barriers and advance the field, this paper comprehensively surveys existing ASG research, offers an in-depth description of its core algorithms (layout and routing), and analyzes in detail the current challenges facing its application in PCB reverse engineering. Around these challenges, feasible solutions are discussed, with the aim of promoting research on automatic PCB schematic generation and contributing new strength to EDA and PCB reverse engineering automation.
    Keywords: Automatic schematic generation; Layout; Routing; Printed circuit board; Reverse engineering; Automation
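    The layout step named among the core algorithms can be illustrated with a toy sketch (our own illustrative example, not an algorithm from the paper): a minimal force-directed placement that pulls net-connected components together with spring forces while a weak pairwise repulsion keeps them apart. The component names, net list, and force constants below are all hypothetical.

```python
import math

def force_directed_layout(nets, components, iters=200, step=0.05):
    """Toy force-directed placement: net-connected components attract
    (spring force), all pairs weakly repel. Returns {component: (x, y)}."""
    # deterministic initial positions on a unit circle
    pos = {c: (math.cos(i), math.sin(i)) for i, c in enumerate(components)}
    for _ in range(iters):
        force = {c: [0.0, 0.0] for c in components}
        for a, b in nets:  # attraction along each two-pin net
            (ax, ay), (bx, by) = pos[a], pos[b]
            dx, dy = bx - ax, by - ay
            force[a][0] += dx; force[a][1] += dy
            force[b][0] -= dx; force[b][1] -= dy
        for i, a in enumerate(components):  # weak pairwise repulsion
            for b in components[i + 1:]:
                (ax, ay), (bx, by) = pos[a], pos[b]
                dx, dy = bx - ax, by - ay
                d2 = dx * dx + dy * dy + 1e-9
                force[a][0] -= 0.1 * dx / d2; force[a][1] -= 0.1 * dy / d2
                force[b][0] += 0.1 * dx / d2; force[b][1] += 0.1 * dy / d2
        for c in components:
            pos[c] = (pos[c][0] + step * force[c][0],
                      pos[c][1] + step * force[c][1])
    return pos

def wirelength(nets, pos):
    """Total Euclidean length of all two-pin nets under a placement."""
    return sum(math.dist(pos[a], pos[b]) for a, b in nets)
```

    Running this on a small netlist should shorten total wirelength relative to the initial circular placement, the basic objective that practical placers optimize at far larger scale.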
    Updated:2025-10-13
  • Regular Papers

    Jiayi GUI, Zhongnan MA, Hao ZHOU, Yan SU, Miaoru ZHANG, Ke YU, Xiaofei WU

    Vol. 26, Issue 9, Pages: 1551-1576(2025) DOI: 10.1631/FITEE.2400467
    Abstract: The advancement of fifth-generation (5G) mobile communication and the Internet of Things (IoT) has facilitated the development of intelligent applications, but has also rendered these networks increasingly complex and vulnerable to various targeted attacks. Numerous anomaly detection (AD) models, particularly those using deep learning technologies, have been proposed to monitor and identify anomalous network events. However, implementing these models poses challenges for network operators due to a lack of expert knowledge about these black-box systems. In this study, we present a comprehensive review of current AD models and methods in the field of communication networks. We categorize these models into four methodological groups based on their underlying principles and structures, with particular emphasis on the role of recent promising large language models (LLMs) in AD. Additionally, we provide a detailed discussion of the models in four application areas: network traffic monitoring, networking system log analysis, cloud and edge service provisioning, and IoT security. Based on these application requirements, we examine the current challenges and offer insights into future research directions, including robustness, explainability, and the integration of LLMs for AD.
    Keywords: Anomaly detection; AIOps; Large language models; Communication networks
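    As a point of contrast with the deep learning models the review covers, the underlying AD task on a network metric can be sketched with a classical statistical baseline (our own illustrative example, not a method from the paper; the window size and threshold are arbitrary): flag a sample when it deviates from its trailing-window mean by more than a few standard deviations.

```python
import math

def zscore_anomalies(series, window=20, threshold=3.0):
    """Flag indices whose value deviates more than `threshold` standard
    deviations from the trailing-window mean."""
    flagged = []
    for i in range(window, len(series)):
        hist = series[i - window:i]
        mu = sum(hist) / window
        var = sum((v - mu) ** 2 for v in hist) / window
        sd = math.sqrt(var) or 1e-9  # guard against a constant window
        if abs(series[i] - mu) / sd > threshold:
            flagged.append(i)
    return flagged
```

    On a flat series with a single spike, only the spike index is flagged; deep models aim to capture the far richer temporal and structural patterns that such a baseline misses.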
    Updated:2025-10-13
  • Regular Papers

    In the field of optimization algorithms, the adaptive dung beetle optimizer (ADBO) marks significant progress. The authors develop ADBO to tackle large-scale complex optimization problems, opening up a new direction for optimization algorithm research.

    Lixin MIAO, Zhenxue HE, Xiaojun ZHAO, Yijin WANG, Xiaodan ZHANG, Kui YU, Limin XIAO, Zhisheng HUO

    Vol. 26, Issue 9, Pages: 1577-1595(2025) DOI: 10.1631/FITEE.2400967
    Abstract: The dung beetle optimizer (DBO) is a metaheuristic algorithm with fast convergence and powerful search capabilities, which has shown excellent performance in solving various optimization problems. However, it easily falls into local optimal solutions and suffers from poor convergence accuracy when dealing with large-scale complex optimization problems. Therefore, we propose an adaptive DBO (ADBO) based on an elastic annealing mechanism to address these issues. First, the convergence factor is adjusted in a nonlinearly decreasing manner to balance the requirements of global exploration and local exploitation, thus improving convergence speed and search quality. Second, a greedy difference optimization strategy is introduced to increase population diversity, improve global search capability, and avoid premature convergence. Finally, the elastic annealing mechanism is used to perturb randomly selected individuals, helping the algorithm escape local optima and thereby improving solution quality and algorithm stability. Experimental results on the CEC 2017 and CEC 2022 benchmark function sets and MCNC benchmark circuits verify the effectiveness, superiority, and universality of ADBO.
    Keywords: Metaheuristic algorithm; Dung beetle optimizer; Convergence factor; Greedy difference optimization strategy; Elastic annealing mechanism
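    The three mechanisms named in the abstract can be shown schematically (this is our own simplified sketch on a toy objective, not the authors' ADBO; all constants are arbitrary): a convergence factor that decreases nonlinearly over iterations, greedy acceptance of candidate moves, and an annealing-style random perturbation whose magnitude decays with a temperature.

```python
import random

def adaptive_search_sketch(f, dim=5, pop=20, iters=300, seed=0):
    """Simplified population search illustrating ADBO-style ingredients:
    nonlinearly decreasing convergence factor, greedy acceptance, and an
    annealing-style perturbation that helps escape local optima."""
    rng = random.Random(seed)
    X = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop)]
    best = min(X, key=f)
    for t in range(iters):
        # nonlinear decreasing convergence factor:
        # large early (exploration), small late (exploitation)
        w = 2.0 * (1 - (t / iters) ** 2)
        temp = 1.0 - t / iters  # annealing "temperature"
        for i in range(pop):
            # move each individual toward the current best, scaled by w
            cand = [x + w * rng.uniform(0, 1) * (b - x)
                    for x, b in zip(X[i], best)]
            # annealing-style perturbation of one random dimension
            cand[rng.randrange(dim)] += rng.gauss(0, 0.5) * temp
            # greedy acceptance: keep the candidate only if it improves
            if f(cand) < f(X[i]):
                X[i] = cand
        best = min(min(X, key=f), best, key=f)
    return best

sphere = lambda v: sum(x * x for x in v)  # toy benchmark objective
```

    Greedy acceptance guarantees the best solution never worsens, while the decaying perturbation supplies the escape moves that the elastic annealing mechanism in the paper is designed to provide.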
    Updated:2025-10-13
  • Regular Papers

    In the field of large-scale digital integrated circuit design, a novel pseudo-circuit generation algorithm based on graph topology is proposed. The algorithm efficiently produces a multitude of power analysis examples, addressing the demand for extensive datasets in electronic design automation. The authors verify the effectiveness of the resulting dataset, underscoring the algorithm's research value.

    Zejia LYU, Jizhong SHEN, Xi CHEN

    Vol. 26, Issue 9, Pages: 1596-1608(2025) DOI: 10.1631/FITEE.2400677
    Abstract:Average power analysis plays a crucial role in the design of large-scale digital integrated circuits (ICs). The integration of data-driven machine learning (ML) methods into the electronic design automation (EDA) fields has increased the demand for extensive datasets. To address this need, we propose a novel pseudo-circuit generation algorithm rooted in graph topology. This algorithm efficiently produces a multitude of power analysis examples by converting randomly generated directed acyclic graphs (DAGs) into gate-level Verilog pseudo-combinational circuit netlists. The subsequent introduction of register units transforms pseudo-combinational netlists into pseudo-sequential circuit netlists. Hyperparameters facilitate the control of circuit topology, while appropriate sequential constraints are applied during synthesis to yield a pseudo-circuit dataset. We evaluate our approach using the mainstream power analysis software, conducting pre-layout average power tests on the generated circuits, comparing their performance against benchmark datasets, and verifying the results through circuit topology complexity analysis and static timing analysis (STA). The results confirm the effectiveness of the dataset, and demonstrate the operational efficiency and robustness of the algorithm, underscoring its research value.  
    Keywords: Graph computation; Electronic design automation (EDA); Pseudo-dataset; Average power analysis
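    The DAG-to-netlist conversion described in the abstract can be illustrated with a toy sketch (our own minimal example, not the paper's algorithm or hyperparameters): drawing each gate's fan-ins only from nets created earlier guarantees acyclicity, and the result is printed as a gate-level Verilog-style combinational netlist.

```python
import random

def random_dag_netlist(n_inputs=3, n_gates=5, seed=0):
    """Toy sketch: build a random DAG over primary inputs and gate outputs,
    then emit a gate-level Verilog-style combinational netlist string."""
    rng = random.Random(seed)
    gate_types = ["and", "or", "xor", "nand"]
    inputs = [f"in{i}" for i in range(n_inputs)]
    nets = list(inputs)  # nets available as fan-in so far
    lines = []
    for g in range(n_gates):
        out = f"w{g}"
        a, b = rng.sample(nets, 2)  # fan-ins from earlier nets -> acyclic
        lines.append(f"  {rng.choice(gate_types)} g{g} ({out}, {a}, {b});")
        nets.append(out)  # this gate's output feeds later gates
    wires = ", ".join(n for n in nets if n.startswith("w"))
    return (f"module pseudo ({', '.join(inputs + ['out'])});\n"
            f"  input {', '.join(inputs)};\n"
            f"  output out;\n"
            f"  wire {wires};\n"
            + "\n".join(lines) + "\n"
            f"  assign out = {nets[-1]};\n"
            f"endmodule\n")
```

    The paper's version additionally controls topology through hyperparameters and inserts registers to obtain sequential netlists; this sketch stops at the combinational case.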
    Updated:2025-10-13
  • Regular Papers

    In the field of cloud computing, this study introduces an efficient privacy-preserving secure neural network inference scheme. The authors propose a network-parameter merging approach and a fast convolution algorithm, which together reduce the linear operation time in the online stage by at least 11%, significantly cutting inference time and communication overhead.

    Liquan CHEN, Zixuan YANG, Peng ZHANG, Yang MA

    Vol. 26, Issue 9, Pages: 1609-1623(2025) DOI: 10.1631/FITEE.2400371
    Abstract: The increasing adoption of smart devices and cloud services, coupled with limitations in local computing and storage resources, prompts numerous users to transmit private data to cloud servers for processing. However, transmitting sensitive data in plaintext raises concerns regarding users' privacy and security. To address these concerns, this study proposes an efficient privacy-preserving secure neural network inference scheme based on homomorphic encryption and secure multi-party computation, which ensures the privacy of both the user and the cloud server while enabling fast and accurate ciphertext inference. First, we divide the inference process into three stages: a merging stage that adjusts the network structure, a preprocessing stage that performs homomorphic computations, and an online stage that performs floating-point operations on secret shares of the private data. Second, we propose an approach for merging network parameters, reducing the cost of multiplication levels and decreasing both ciphertext–plaintext multiplication and addition operations. Finally, we propose a fast convolution algorithm to enhance computational efficiency. Compared with other state-of-the-art methods, our scheme reduces the linear operation time in the online stage by at least 11%, significantly reducing inference time and communication overhead.
    Keywords: Secure neural network inference; Convolutional neural network; Privacy-preserving; Homomorphic encryption; Secret sharing
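    The property that the online stage's secret sharing relies on, namely that a linear layer distributes over additive shares, can be shown in a few lines (our own minimal illustration, not the paper's protocol; the weights and share range are arbitrary):

```python
import random

def share(x, rng):
    """Additively secret-share a vector: x = s0 + s1, with s0 sampled at
    random so neither share alone reveals x."""
    s0 = [rng.uniform(-10, 10) for _ in x]
    s1 = [xi - si for xi, si in zip(x, s0)]
    return s0, s1

def linear(W, v):
    """Plain matrix-vector product W·v."""
    return [sum(w * vi for w, vi in zip(row, v)) for row in W]

# Linearity gives W·x = W·(s0 + s1) = W·s0 + W·s1, so each party applies the
# layer to its own share and the sum of partial results reconstructs the true
# output without either party seeing x in the clear.
```

    The nonlinear layers are what force the heavier homomorphic and multi-party machinery; the linear portion above is exactly where the paper's merging and fast convolution savings apply.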
    Updated:2025-10-13

Videos

  • 2023 Issue 1 | Scalability and efficiency challenges for the exascale supercomputing system: practice of a parallel supporting environment on the Sunway exascale prototype system 00:02:51
    2023-12-30 | Play Total: 23
  • 2023 Issue 6 | Model division multiple access for semantic communications 00:02:30
    2023-12-30 | Play Total: 13
  • 2022 Issue 10 | Discussion on a new paradigm of endogenous security towards 6G networks 00:02:15
    2023-12-30 | Play Total: 2
  • 2022 Issue 12 | Technology trends in large-scale high-efficiency network computing 00:02:22
    2023-12-30 | Play Total: 2
  • 2022 Issue 6 | Self-deployed execution environment for high performance computing 00:02:48
    2022-08-03 | Play Total: 8
  • 2022 Issue 2 | A full-process intelligent trial system for smart court 00:02:24
    2022-05-17 | Play Total: 8
  • 2022 Issue 3 | Automatic protocol reverse engineering for industrial control systems with dynamic taint analysis 00:02:37
    2022-05-17 | Play Total: 5
  • P1 Speech by Academician Baoyan Duan 00:05:36
    2022-04-17 | Play Total: 12
  • P2 Speech by Professor Min Sheng, Xidian University 00:02:27
    2022-04-17 | Play Total: 7
  • P3 Speech by Professor Yunsong Li, Xidian University 00:02:37
    2022-04-17 | Play Total: 11
