25 December 2021, Volume 6 Issue 4
    

  • Review papers
  • Hengtao He, Xianghao Yu, Jun Zhang, Shenghui Song, Khaled B. Letaief
    Journal of Communications and Information Networks. 2021, 6(4): 321-335. https://doi.org/10.23919/JCIN.2021.9663100

    The recently commercialized fifth-generation (5G) wireless networks have achieved many improvements, including air interface enhancement, spectrum expansion, and network densification, enabled by several key technologies such as massive multiple-input multiple-output (MIMO), millimeter-wave communications, and ultra-dense networking. Despite the deployment of 5G commercial systems, wireless communications still faces many challenges in enabling connected intelligence and a myriad of applications such as the industrial Internet-of-things, autonomous systems, brain-computer interfaces, digital twins, and the tactile Internet. Therefore, it is urgent to start research on sixth-generation (6G) wireless communication systems. Among the candidate technologies for 6G, cell-free massive MIMO, which combines the advantages of distributed systems and massive MIMO, is a promising solution to enhance wireless transmission efficiency and provide better coverage. In this paper, we present a comprehensive study of cell-free massive MIMO for 6G wireless communication networks, with a special focus on the signal processing perspective. Specifically, we introduce enabling physical-layer technologies for cell-free massive MIMO, such as user association, pilot assignment, transmitter and receiver design, and power control and allocation. Furthermore, some current and future research problems are described.
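
    As an illustration of the receiver-design topic the survey covers, the following minimal sketch shows conjugate (matched-filter) combining in an uplink cell-free massive MIMO system: each distributed access point correlates its received signal with its local channel, and a central processing unit sums the per-AP statistics for every user. It assumes perfect channel state information and Rayleigh fading; the dimensions and SNR are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
M, K = 16, 4            # M distributed access points, K single-antenna users
snr_linear = 10.0       # per-user transmit SNR (linear scale, assumed)

# Rayleigh-fading channel from each user to each AP (perfect CSI assumed)
G = (rng.standard_normal((M, K)) + 1j * rng.standard_normal((M, K))) / np.sqrt(2)

# Uplink: all users transmit unit-power symbols simultaneously
s = np.exp(2j * np.pi * rng.random(K))
noise = (rng.standard_normal(M) + 1j * rng.standard_normal(M)) / np.sqrt(2)
y = np.sqrt(snr_linear) * G @ s + noise         # signal received across all APs

# Conjugate (matched-filter) combining: the CPU sums per-AP statistics per user
s_hat = G.conj().T @ y
print(np.round(s_hat / np.abs(s_hat), 2))       # rough phase estimates of the K symbols
```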

  • Qiao Lan, Dingzhu Wen, Zezhong Zhang, Qunsong Zeng, Xu Chen, Petar Popovski, Kaibin Huang
    Journal of Communications and Information Networks. 2021, 6(4): 336-371. https://doi.org/10.23919/JCIN.2021.9663101

    In the 1940s, Claude Shannon developed information theory, focusing on quantifying the maximum data rate that can be supported by a communication channel. Guided by this fundamental work, the main theme of wireless system design up until the fifth generation (5G) was data rate maximization. In Shannon's theory, the semantic aspect and meaning of messages were treated as largely irrelevant to communication. The classic theory started to reveal its limitations in the modern era of machine intelligence, consisting of the synergy between the Internet-of-things (IoT) and artificial intelligence (AI). By broadening the scope of the classic communication-theoretic framework, in this article we present a view of semantic communication (SemCom), which conveys meaning through communication systems. We address three communication modalities: human-to-human (H2H), human-to-machine (H2M), and machine-to-machine (M2M) communications. The latter two represent the paradigm shift in communication and computing, and define the main theme of this article. H2M SemCom refers to semantic techniques for conveying meanings understandable not only by humans but also by machines, so that they can interact and hold a "dialogue". On the other hand, M2M SemCom refers to techniques for efficiently connecting multiple machines such that they can effectively execute a specific computation task in a wireless network. The first part of this article introduces SemCom principles, including semantic encoding, a layered system architecture, and two design approaches: 1) layer-coupling design; and 2) end-to-end design using a neural network. The second part discusses specific techniques for different application areas of H2M SemCom [including human and AI symbiosis, recommendation, human sensing and care, and virtual reality (VR)/augmented reality (AR)] and M2M SemCom (including distributed learning, split inference, distributed consensus, and machine-vision cameras). Finally, we discuss the approach of designing SemCom systems based on knowledge graphs. We believe that this comprehensive introduction will provide a useful guide to the emerging area of SemCom, which is expected to play an important role in sixth-generation (6G) networks featuring connected intelligence and integrated sensing, computing, communication, and control.
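
    To make the second design approach concrete, here is a minimal sketch of an end-to-end SemCom autoencoder trained through a simulated AWGN channel. It is an illustrative toy, not the architecture of any system surveyed in the article; the `SemComAutoencoder` name, layer sizes, and SNR are all assumptions.

```python
import torch
import torch.nn as nn

class SemComAutoencoder(nn.Module):
    """Toy end-to-end design: transmitter MLP -> AWGN channel -> receiver MLP."""
    def __init__(self, msg_dim=32, channel_uses=8, snr_db=10.0):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(msg_dim, 64), nn.ReLU(),
                                     nn.Linear(64, channel_uses))
        self.decoder = nn.Sequential(nn.Linear(channel_uses, 64), nn.ReLU(),
                                     nn.Linear(64, msg_dim))
        self.noise_std = 10 ** (-snr_db / 20)

    def forward(self, msg):
        x = self.encoder(msg)
        # Normalize to unit average power before "transmission"
        x = x / x.norm(dim=-1, keepdim=True) * x.shape[-1] ** 0.5
        y = x + self.noise_std * torch.randn_like(x)   # AWGN channel
        return self.decoder(y)

model = SemComAutoencoder()
msg = torch.randn(16, 32)                  # batch of source embeddings
loss = nn.functional.mse_loss(model(msg), msg)
loss.backward()                            # gradients flow end to end through the channel
print(float(loss))
```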

  • Hui Liang, Wei Zhang
    Journal of Communications and Information Networks. 2021, 6(4): 372-384. https://doi.org/10.23919/JCIN.2021.9663102

    Resource allocation is a fundamental and vital issue for any wireless network, since it guarantees that the network runs efficiently and stably. In particular, in 5G and future 6G networks, numerous distinct wireless service demands will be requested by various vertical industries. How to select the most suitable method from the countless available ones, accurately and efficiently, is still a largely unsolved problem. In this survey, we give a comprehensive review of the domain of resource allocation in wireless communication. The survey begins with an overview of the resource allocation problem in present wireless networks and of related work reviewing resource allocation methods. We then give a novel unified framework, based on a six-tuple model, for existing resource allocation methods in wireless networks. According to this model, we categorize these methods into two main classes, namely single-decision-maker and multiple-decision-maker resource allocation. Comprehensive discussion and representative research results for this taxonomy are subsequently elaborated. Finally, conclusions and some promising research opportunities are presented at the end of the survey.
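
    As a concrete instance of a single-decision-maker allocation method within such a taxonomy, consider the classic water-filling power allocation over parallel channels. This is a textbook method used purely for illustration, not the survey's six-tuple model itself; the channel gains and power budget below are made up.

```python
import numpy as np

def water_filling(gains, total_power):
    """Water-filling over parallel channels: p_k = max(mu - 1/g_k, 0),
    with the water level mu chosen so that sum(p_k) == total_power."""
    inv = 1.0 / np.asarray(gains, dtype=float)   # inverse channel gains
    inv_sorted = np.sort(inv)
    for k in range(len(inv), 0, -1):             # try k active channels, largest first
        mu = (total_power + inv_sorted[:k].sum()) / k
        if mu > inv_sorted[k - 1]:               # all k channels stay "above water"
            break
    return np.maximum(mu - inv, 0.0)

p = water_filling(gains=[2.0, 1.0, 0.2], total_power=3.0)
print(np.round(p, 3), p.sum())   # weakest channel gets nothing; powers sum to the budget
```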

  • Research papers
  • Jiaqi Huang, Yi Qian, Rose Qingyang Hu
    Journal of Communications and Information Networks. 2021, 6(4): 385-395. https://doi.org/10.23919/JCIN.2021.9663103

    Ubiquitous information exchange is achieved among connected vehicles through the increasingly smart environment, and the concept of the conventional vehicular ad hoc network is gradually being transformed into the Internet of vehicles (IoV). Meanwhile, more and more location-based services (LBSs) are being created to provide convenience for drivers. However, the frequently updated location information sent to the LBS server also puts user location privacy at risk. Thus, preserving user location privacy while allowing vehicles to enjoy high-quality LBSs is a critical issue. Many solutions have been proposed in the literature to preserve location privacy, but most of them cannot provide real-time LBSs with accurate location updates. In this paper, we propose a novel location privacy-preserving scheme, which allows vehicles to send accurate real-time location information to the LBS server while preventing them from being tracked by attackers. In the proposed scheme, a vehicle utilizes the location information of selected shadow vehicles, whose routes diverge from the requester's, to generate multiple virtual trajectories to the LBS server so as to mislead attackers. Simulation results show that our proposed scheme achieves a high privacy-preserving level and outperforms other state-of-the-art schemes in terms of location entropy and tracking success ratio.
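
    For intuition about the location-entropy metric used in the evaluation, the following minimal sketch computes the Shannon entropy of an attacker's belief over the reported trajectories. The probability values are hypothetical and the function is illustrative, not the paper's implementation.

```python
import numpy as np

def location_entropy(probs):
    """Entropy of the attacker's belief over candidate trajectories.

    probs: attacker-assigned probabilities that each reported trajectory
    (the real one plus the virtual ones built from shadow vehicles)
    belongs to the target vehicle. Higher entropy means stronger privacy.
    """
    p = np.asarray(probs, dtype=float)
    p = p[p > 0] / p.sum()
    return float(-(p * np.log2(p)).sum())

# One real trajectory plus three indistinguishable virtual ones: entropy
# reaches the 2-bit maximum; a skewed belief makes tracking easier.
print(location_entropy([0.25, 0.25, 0.25, 0.25]))   # 2.0 bits
print(location_entropy([0.7, 0.1, 0.1, 0.1]))       # about 1.36 bits
```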

  • Boyu Deng, Chunxiao Jiang, Jingchao Wang, Linling Kuang
    Journal of Communications and Information Networks. 2021, 6(4): 396-410. https://doi.org/10.23919/JCIN.2021.9663104

    Beam scheduling is one of the most important issues in data relay satellite systems (DRSSs), as it can improve the utilization efficiency of limited system resources by programming beam allocation for relay missions. Ever-increasing relay missions create a substantial challenge for beam scheduling due to the growth of different mission demands, and the cooperative usage of different beams further increases the complexity of this problem. Therefore, we develop a novel optimization method to solve the beam scheduling problem for the scenario of various mission demands in the DRSS. Based on an analysis of mission demands and resource features, we first construct a heterogeneous parallel-machine scheduling model to formulate the beam scheduling problem in the DRSS. To solve this complicated model, we investigate the matching method between mission demands and beam resources, and introduce two concepts, the loose duration and the number of available beams, to make the matching process more effective. Then, three algorithms are proposed. Our first approach, the maximized completion probability algorithm (MCPA), applies a greedy strategy based on the new concepts to allocate beams for missions; two improved versions of this algorithm are also presented, which employ the strategies of mission insertion optimization and mission sequence optimization, respectively. Simulation results show that the proposed algorithms are superior to existing algorithms in terms of the number of scheduled missions, the weight of scheduled missions, and the processing time, significantly improving the performance of beam scheduling in the DRSS.
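
    The two matching concepts suggest a most-constrained-first greedy rule. The sketch below is a heavily simplified stand-in for that idea, not the authors' MCPA: it assumes one mission per beam at a time, ignores visibility windows and slewing, and all mission data are hypothetical.

```python
def greedy_schedule(missions, num_beams):
    """missions: list of (mission_id, duration, deadline, compatible_beams)."""
    beam_free_at = [0.0] * num_beams
    schedule = []
    # Most-constrained first: fewer available beams, then tighter slack
    # between deadline and processing time (a stand-in for "loose duration").
    for mid, dur, deadline, beams in sorted(
            missions, key=lambda m: (len(m[3]), m[2] - m[1])):
        best = min(beams, key=lambda b: beam_free_at[b])  # earliest-free beam
        start = beam_free_at[best]
        if start + dur <= deadline:                       # completes in time
            beam_free_at[best] = start + dur
            schedule.append((mid, best, start))
    return schedule

missions = [("A", 3.0, 5.0, [0, 1]), ("B", 2.0, 4.0, [1]), ("C", 1.0, 9.0, [0, 1])]
print(greedy_schedule(missions, num_beams=2))  # B goes first: only one beam fits it
```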

  • Qianfan Wang, Suihua Cai, Li Chen, Xiao Ma
    Journal of Communications and Information Networks. 2021, 6(4): 411-419. https://doi.org/10.23919/JCIN.2021.9663105

    This paper presents a new coding scheme called the semi-low-density parity-check convolutional code (semi-LDPC-CC), whose parity-check matrix consists of both sparse and dense sub-matrices, a feature that distinguishes it from conventional LDPC-CCs. We propose sliding-window list (SWL) decoding algorithms with a fixed window size of two, resulting in low decoding latency but competitive error-correcting performance. The performance can be predicted by upper bounds derived from the first-event error probability and by genie-aided (GA) lower bounds estimated from the underlying LDPC block codes (LDPC-BCs), while the complexity can be reduced by truncating the list with a threshold on the difference between the soft metrics in the serial decoding implementation. Numerical results are presented to validate our analysis and demonstrate the performance advantage of semi-LDPC-CCs over conventional LDPC-CCs.
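
    The complexity-reduction step (truncating the candidate list by a soft-metric threshold) can be illustrated on its own. The sketch below is a generic list-pruning rule with made-up paths and metrics, not the paper's SWL decoder.

```python
def truncate_list(candidates, threshold):
    """candidates: list of (path, soft_metric); higher metric = more likely.

    Drop any candidate whose metric falls more than `threshold` below the
    best survivor, trading a little performance for lower complexity.
    """
    best = max(m for _, m in candidates)
    return [(p, m) for p, m in candidates if best - m <= threshold]

window = [("path-0", -1.2), ("path-1", -1.5), ("path-2", -7.8), ("path-3", -2.0)]
print(truncate_list(window, threshold=1.0))   # keeps path-0, path-1, path-3
```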

  • Yulun Cheng, Wenchao Xia, Haitao Zhao, Longxiang Yang, Hongbo Zhu
    Journal of Communications and Information Networks. 2021, 6(4): 420-428. https://doi.org/10.23919/JCIN.2021.9663106

    Due to its wide coverage, stable links, and low latency, the cellular network has become one of the main access networks for the Internet of things (IoT). However, the expenses of backhauls and node charging grow quickly with the number of nodes. To address this issue, this paper studies joint caching and user association for energy-harvesting-aided IoT with full-duplex backhauls. We formulate node charging, full-duplex base station association, and cache allocation as a Stackelberg game. The structural characteristics are then utilized to decompose the model into a two-layer knapsack problem. On this basis, an alternating-direction iteration algorithm is proposed, which first compresses the solution space using the constraints and then obtains the optimization results by alternating iterations. Simulation results verify the effectiveness of the proposed algorithm in utility improvement and cost reduction.
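
    To ground the decomposition, here is a minimal 0/1 knapsack dynamic program of the kind such a two-layer reduction bottoms out in. The cache-allocation interpretation and all numbers are hypothetical, and this is not the paper's algorithm.

```python
def knapsack(values, weights, capacity):
    """0/1 knapsack via dynamic programming (integer capacity assumed)."""
    dp = [0] * (capacity + 1)
    for v, w in zip(values, weights):
        for c in range(capacity, w - 1, -1):    # reverse scan: each item used once
            dp[c] = max(dp[c], dp[c - w] + v)
    return dp[capacity]

# Hypothetical cache-allocation instance: value = utility gain of caching a
# file at the base station, weight = cache blocks the file occupies.
print(knapsack(values=[6, 10, 12], weights=[1, 2, 3], capacity=5))   # -> 22
```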

  • Huayan Guo, Yifan Zhu, Haoyu Ma, Vincent K. N. Lau, Kaibin Huang, Xiaofan Li, Huabin Nong, Mingyu Zhou
    Journal of Communications and Information Networks. 2021, 6(4): 429-442. https://doi.org/10.23919/JCIN.2021.9663107

    In this paper, we develop an orthogonal frequency-division multiplexing (OFDM)-based over-the-air (OTA) aggregation solution for wireless federated learning (FL). In particular, the local gradients in massive Internet of things (IoT) devices are modulated by an analog waveform and are then transmitted using the same wireless resources. Achieving perfect waveform superposition is therefore the key challenge, which is difficult due to the existence of frame timing offset (TO) and carrier frequency offset (CFO). To address these issues, we propose a two-stage waveform pre-equalization technique with a customized multiple-access protocol that can estimate and then mitigate the TO and CFO for OTA aggregation. Based on the proposed solution, we develop a hardware transceiver and application software to train a real-world FL task, which learns a deep neural network to predict received signal strength from global positioning system information. Experiments verify that the proposed OTA aggregation solution achieves performance comparable to offline learning procedures, with high prediction accuracy.
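
    To illustrate the core idea of OTA aggregation with pre-equalization, here is a minimal numpy sketch in which each device inverts its own channel so that the analog-modulated gradients superpose coherently at the server. It abstracts away the paper's two-stage TO/CFO estimation; the device count, subcarriers, fading model, and noise level are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
K, N = 8, 64          # K IoT devices, N OFDM subcarriers (one gradient entry each)
grads = rng.standard_normal((K, N))            # local gradients, one row per device

# Flat Rayleigh fading per device/subcarrier; each device pre-equalizes by
# inverting its own channel (an idealization: real systems cap the inversion
# to limit transmit power, and must first correct TO and CFO).
h = (rng.standard_normal((K, N)) + 1j * rng.standard_normal((K, N))) / np.sqrt(2)
tx = grads / h                                  # analog modulation + pre-equalization
noise = 0.01 * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
y = (h * tx).sum(axis=0) + noise                # over-the-air superposition

aggregate = y.real / K                          # server reads off the averaged gradient
print(np.max(np.abs(aggregate - grads.mean(axis=0))))   # small residual error
```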