In communication and network technologies, this research center aims to meet the demanding network specifications in latency, reliability, and number of connections required for perceptual networking.

The focus is on the research and development of key technologies such as low-latency, low age-of-information communication, semantic communication, and IoT platforms and management.

By integrating artificial intelligence technologies, the research moves beyond traditional coding and transmission technologies.

Among these, the development of highly efficient semantic communication technologies from the sensing end to the generation end is a forward-looking research goal.

By exploiting the intelligent computing capabilities at the sensing and generation ends, the goal is to reduce data volume by several orders of magnitude and, through control of the age of information, to meet the low-latency requirements of applications.


 
Low Latency and Low Age Communication

Designing low-latency decoders is a primary focus of our research under the low-latency communication requirement. Polar codes, known for approaching channel capacity, have been adopted in fifth-generation mobile communication systems. Belief propagation decoders, which can be implemented in parallel, satisfy low-latency requirements better than other decoders, but their error-rate performance still lags behind successive cancellation list (SCL) decoding. To address this, we developed an enhanced decoding method based on the newly proposed sparse even-column matrix, achieving performance close to the lower bound of SCL decoding. Additionally, we proposed an analysis tool and a polar code construction method for STR decoding. Computer simulations verified that the codes constructed by our Center outperform both the polar codes used in fifth-generation mobile communication systems and those designed for successive cancellation decoding. The research results were published in the Proceedings of the IEEE International Symposium on Information Theory and the Proceedings of the Symposium on Information Theory.
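As background to the construction results above, polar code construction for a binary erasure channel can be sketched with the classical Bhattacharyya-parameter recursion. This is a textbook method, not the Center's proposed construction, and the erasure probability is an illustrative assumption.

```python
# Toy polar code construction via Bhattacharyya-parameter recursion.
# On a BEC with erasure probability eps, each synthetic bit-channel's
# Bhattacharyya parameter Z evolves as
#   Z(W-) = 2Z - Z^2   (degraded channel)
#   Z(W+) = Z^2        (upgraded channel)
# The K positions with the smallest Z carry the information bits.

def bhattacharyya(n_levels, eps=0.5):
    """Return the Bhattacharyya parameters of the 2**n_levels bit-channels."""
    z = [eps]
    for _ in range(n_levels):
        z = [v for p in z for v in (2 * p - p * p, p * p)]
    return z

def construct_polar(n_levels, k, eps=0.5):
    """Pick the k most reliable bit-channel indices for an (N, k) polar code."""
    z = bhattacharyya(n_levels, eps)
    return sorted(sorted(range(len(z)), key=lambda i: z[i])[:k])

info_set = construct_polar(3, 4)   # (8, 4) code
print(info_set)                    # -> [3, 5, 6, 7]
```

The recursion doubles the channel list at each of the n levels, so the cost is linear in the block length N = 2**n_levels.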
 

Short Error Correction Codes and Low-Latency Decoders
Designing short error correction codes and low-latency decoders with innovative overlapping retransmission is another critical issue. The successive cancellation list flip (SCLF) decoding algorithm achieves better error-rate performance by flipping erroneous bits in the next decoding attempt. We propose new features to train three stacked Long Short-Term Memory (LSTM) models that assist SCLF decoding. Specifically, the stacked LSTM flip-1 and flip-2 models predict the locations of the first and second erroneous bits, respectively, while the stacked LSTM Continued Flip Checking (CFC) model determines whether to continue flipping at the next erroneous bit. The structure of the deep learning model is shown in Figure B-15 below.

Simulation results demonstrate that the DL-Aided SCLF decoding algorithm, based on the proposed stacked LSTM flip-1 model, stacked LSTM flip-2 model, and stacked LSTM CFC model, provides superior performance with a lower average number of decoding attempts compared to other state-of-the-art decoding algorithms. Notably, these models are trained before decoding, so the primary increase in complexity occurs in the offline phase. The research results were published in IEEE Transactions on Cognitive Communications and Networking (Early Access).
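The flip-decoding control flow described above can be sketched as follows. The SCL decoder, CRC check, and the three stacked-LSTM predictors are represented by caller-supplied stubs; `decode_scl`, `crc_ok`, `predict_flip_1`, `predict_flip_2`, and `cfc_continue` are hypothetical names, not the Center's actual interfaces.

```python
# Control-flow sketch of DL-aided SCLF decoding with up to two flips.
# The neural models are queried only when the CRC check fails, so the
# expensive training happens entirely offline, as noted above.

def sclf_decode(llrs, decode_scl, crc_ok, predict_flip_1, predict_flip_2,
                cfc_continue):
    """Return (codeword, attempts), flipping predicted bit positions."""
    flips = []
    word = decode_scl(llrs, flips)          # first, ordinary SCL attempt
    attempts = 1
    if crc_ok(word):
        return word, attempts
    # Stacked LSTM flip-1 model: most likely position of the first error.
    flips = [predict_flip_1(llrs)]
    word = decode_scl(llrs, flips)
    attempts += 1
    if crc_ok(word) or not cfc_continue(llrs, flips):
        return word, attempts
    # Stacked LSTM flip-2 model: second error position, given the first flip.
    flips.append(predict_flip_2(llrs, flips))
    word = decode_scl(llrs, flips)
    attempts += 1
    return word, attempts
```

Because the models only gate extra decoding attempts, the average number of attempts stays close to one at moderate-to-high SNR, consistent with the complexity claim above.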


 

Figure B-15: Flip position prediction structure based on stacked LSTM models
 
Intelligent Relay Transmission Mechanism
From the perspective of communication networks, traditional methods for reducing transmission delay focus solely on how quickly information is delivered. This approach incurs significant transmission costs and struggles to maintain low latency in large-scale networks with limited bandwidth resources. Considering the "real-time nature" of information, that is, its freshness, we have designed an intelligent relay transmission mechanism (as shown in Fig. B-16). The mechanism forwards sensed information through a relay station, which chooses its relay strategy based on the age of the information and the quality of the transmission channel in order to keep networked information fresh.

To provide flexibility and reduce maintenance costs, the relay station harvests the wireless signal energy transmitted by networked devices and recharges itself wirelessly. Because it relies on harvested energy, the relay station must accumulate sufficient power before it can retransmit signals; its decisions are therefore closely tied to the timing of wireless energy harvesting, the quality of the wireless channel, and the freshness of networked messages. To handle this complex decision problem, we use deep reinforcement learning (DRL) as the decision-making center of the relay station. Experimental results show that our proposed method improves information freshness by approximately 50% compared to existing methods. The research result was published in IEEE Wireless Communications Letters.

Semantic Communication
Semantic communication is an emerging area of research that requires a shared knowledge base between the sender and receiver to enable semantic encoding and reduce communication costs.

Recently, rapid developments in artificial intelligence, especially the emergence of large language models (LLMs) such as OpenAI's ChatGPT and Google's Gemini, have made it possible to efficiently compress complex data, including articles and video images. These breakthroughs have been dubbed the "emergence of capabilities" and are attributed to two key factors: the unusually large number of parameters in the neural networks (about 175 billion) and the massive training datasets (covering much of the internet).


To provide a theoretical basis for the development of semantic communication technologies and to further understand the emergence phenomenon, we are building mathematical models to explain it. We have posted preliminary results on the public preprint platform arXiv.


Figure B-16: Intermediate Relay System for Maintaining Information Freshness
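As a minimal illustration of the age-of-information (AoI) dynamics underlying the relay design above, the following sketch simulates a relay that harvests energy every slot and transmits when its battery and the channel allow. A fixed threshold policy stands in for the DRL agent, and all parameters (harvest rate, transmit cost, channel probability) are illustrative assumptions, not the published system model.

```python
import random

# AoI bookkeeping for an energy-harvesting relay: the age of the freshest
# delivered update resets on a successful relay slot and grows otherwise.

def simulate_aoi(slots, p_good=0.7, harvest=1, tx_cost=2, seed=0):
    """Return the average age of information over `slots` time slots."""
    rng = random.Random(seed)
    battery, age, ages = 0, 1, []
    for _ in range(slots):
        battery += harvest                   # wireless energy harvesting
        good = rng.random() < p_good         # channel quality this slot
        if battery >= tx_cost and good:      # enough power and a good channel
            battery -= tx_cost
            age = 1                          # update delivered: age resets
        else:
            age += 1                         # update stale: age grows
        ages.append(age)
    return sum(ages) / len(ages)

print(round(simulate_aoi(10_000), 2))
```

A learned policy would additionally condition the transmit decision on the current age, battery level, and channel estimate rather than on a fixed threshold, which is where the roughly 50% freshness gain reported above comes from.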
 
UAV-Based IoT Data Collection
In terms of mobile nodes, we designed an algorithm in which an unmanned aerial vehicle (UAV) collects data from terrestrial IoT devices, each of which has a certain amount of data to send to the UAV within a specific timeframe. Our goal is to plan the UAV's flight path so that it collects data from the IoT devices while maximizing its profit. Since this problem is NP-hard, we propose a three-stage heuristic algorithm. In the first stage, we solve the Traveling Salesperson Problem (TSP) to obtain the UAV's flight trajectory. In the second stage, we propose an algorithm that reorders the data collection sequence or removes from the flight path those IoT devices whose time constraints cannot be met. In the third stage, the flight distance obtained in the second stage is further reduced. Simulation results show that the proposed algorithm outperforms previous approaches in both total profit and execution time. In addition, we designed an algorithm for the case where a data collection point lies in a restricted region, so the UAV can only collect its data from the periphery of that region. We use dynamic programming to find the flight path that completes data collection at multiple collection points in the shortest possible time. The results of this research were published in the Proceedings of the IEEE International Conference on Communications (ICC) and in IEEE Transactions on Mobile Computing.
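The first two stages of the heuristic can be sketched as follows, under simplifying assumptions: a nearest-neighbour TSP heuristic, unit flight speed, and per-device deadlines standing in for the paper's timing model. The third, path-shortening stage is omitted.

```python
import math

# Stage 1: greedy nearest-neighbour tour over device coordinates.
# Stage 2: walk the tour and drop devices reached after their deadline.
# Coordinates and deadlines below are illustrative, not from the paper.

def nearest_neighbour_tour(points):
    """Build a tour starting from the depot at index 0."""
    unvisited = set(range(1, len(points)))
    tour, cur = [0], 0
    while unvisited:
        cur = min(unvisited,
                  key=lambda j: math.dist(points[cur], points[j]))
        unvisited.remove(cur)
        tour.append(cur)
    return tour

def drop_late_devices(points, tour, deadlines, speed=1.0):
    """Keep only devices the UAV reaches before their deadline."""
    kept, t, cur = [0], 0.0, 0
    for j in tour[1:]:
        arrival = t + math.dist(points[cur], points[j]) / speed
        if arrival <= deadlines[j]:
            kept.append(j)
            t, cur = arrival, j      # visit the device and move on
    return kept
```

A real second stage would also try reordering before dropping, since a device missed in one sequence may be feasible in another; the sketch only shows the pruning half.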

Multi-Objective Microservice Allocation Algorithm (MOMA)
Managing Internet of Things (IoT) resources in cloud networks is highly complex due to the diverse requirements of various application services. This year, our focus was on optimizing microservice management and resource allocation in heterogeneous cloud environments for IoT applications. To tackle this challenge, we introduced an advanced management framework called the Multi-Objective Microservice Allocation algorithm (MOMA), whose architecture is shown in Figure B-18. MOMA addresses two crucial factors in microservice resource allocation: optimizing resource utilization and minimizing network communication costs. By transforming these objectives into a constrained optimization problem, the framework enables efficient resource management across diverse cloud systems. This approach streamlines cloud service deployment, simplifies workload monitoring, and enhances analytical capabilities. We conducted a comprehensive evaluation of MOMA against existing algorithms using real-world datasets. The experimental results demonstrate that MOMA significantly improves resource utilization, reduces network transmission costs, and enhances network reliability. Detailed findings have been submitted to IEEE Access and are under review.

Figure B-18: Heterogeneous Cloud Microservice Management Framework Architecture
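A greedy sketch of the two objectives described above (packing for utilization, co-location for low network cost) follows. The service names, demands, and traffic rates are hypothetical, and a first-fit-style heuristic stands in for MOMA's actual constrained-optimization solver.

```python
# Place each microservice on the candidate node that minimizes the traffic
# it would exchange with already-placed peers on other nodes, subject to
# per-node CPU capacity. Largest services are placed first.

def allocate(services, capacity, traffic):
    """services: {name: cpu demand}; capacity: [cpu per node];
    traffic: {(a, b): message rate between services a and b}."""
    placement, load = {}, {}
    for name, demand in sorted(services.items(), key=lambda s: -s[1]):
        def cost(node):
            # Remote traffic incurred if `name` is placed on `node`;
            # unplaced peers are optimistically assumed co-located.
            return sum(rate for (a, b), rate in traffic.items()
                       if name in (a, b)
                       and placement.get(b if a == name else a, node) != node)
        candidates = [n for n in range(len(capacity))
                      if load.get(n, 0) + demand <= capacity[n]]
        best = min(candidates, key=cost)     # ties favor low-index nodes
        placement[name] = best
        load[best] = load.get(best, 0) + demand
    return placement
```

For example, two chatty services end up on the same node while an unrelated one spills to the next, which is exactly the utilization/communication trade-off the framework formalizes.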
 
Physical Layer Authentication Technique
The broadcast nature of IoT wireless transmissions presents significant challenges for ensuring the security and authentication of large-scale devices. To address these challenges, we propose a physical layer authentication technique tailored for mobile scenarios, leveraging deep learning and channel state information. This approach, illustrated in Fig. B-19, utilizes a Convolutional Neural Network (CNN) to analyze temporal and spatial similarities in channel information.

The CNN outputs a score that quantifies differences between channel states observed at different times, enabling device identity verification.
Our experimental platform, based on WiFi, enabled a comprehensive evaluation of this technique. We studied the impact of the distance between legitimate and malicious devices on authentication performance and assessed the CNN model's generalization across various test scenarios. The results highlight the effectiveness of our CNN-based authentication approach compared to traditional correlation-based methods. Detailed findings were published in the Proceedings of the IEEE Global Communications Conference (GLOBECOM).

Figure B-19: Schematic diagram of physical-layer identity authentication
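For contrast with the CNN approach, the traditional correlation-based baseline mentioned above can be sketched as a similarity check between consecutive channel-state observations: a legitimate device's channel changes slowly between nearby measurements, while an attacker at a different location produces a dissimilar channel. The threshold and CSI vectors here are illustrative assumptions.

```python
import math

# Correlation-based physical-layer authentication: accept the claimed
# identity only if the new CSI vector is close to the last accepted one.

def cosine_similarity(u, v):
    """Cosine of the angle between two real-valued CSI vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.hypot(*u) * math.hypot(*v))

def authenticate(prev_csi, new_csi, threshold=0.9):
    """Accept if the channel changed little since the last observation."""
    return cosine_similarity(prev_csi, new_csi) >= threshold
```

The CNN-based method described above replaces this single hand-crafted score with a learned score over temporal and spatial channel features, which is what gives it better generalization across scenarios.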