Abstract: Wireless big data describes a wide range of massive data that are generated, collected, and stored in wireless networks by wireless devices and users. While these data share some common properties with traditional big data, they have their own unique characteristics and provide numerous advantages for academic research and practical applications. This article reviews recent advances and trends in the field of wireless big data. Due to space constraints, this survey is not intended to cover all aspects of the field, but focuses on data-aided transmission, data-driven network optimization, and novel applications. It is expected that the survey will help readers better understand this exciting and emerging research field. Moreover, open issues and promising future directions are also identified.
5G (fifth generation) mobile communications aim to support a large and versatile set of services with different and often diverging requirements, which has posed significant challenges to the design of 5G systems. Modulation and waveforms are among the key physical-layer components that determine system throughput, reliability, and complexity; their design is therefore critical in meeting the varied requirements of 5G services. A comprehensive overview is presented of the modulation schemes and waveforms that have been considered in the literature for potential application to 5G, identifying their design requirements and discussing their advantages in meeting those requirements. Additional considerations that extend our view to higher-layer aspects and air-interface harmonization are provided as final remarks. We hope this article draws greater attention from readers to this important topic and triggers further studies on the promising modulation and waveform candidates.
Abstract: In HetNets (Heterogeneous Networks), each network is allocated a fixed spectrum resource and provides service to its assigned users using a specific RAT (Radio Access Technology). Due to the high dynamics of load distribution among different networks, simply optimizing the performance of an individual network can hardly meet the demands of the dramatically increasing number of access devices, the consequent upsurge of data traffic, and dynamic user QoE (Quality-of-Experience). The deployment of smart networks, which are supported by SRA (Smart Resource Allocation) among different networks and CUA (Cognitive User Access) among different users, is deemed a promising solution to these challenges. In this paper, we propose a framework to transform HetNets into smart networks by leveraging WBD (Wireless Big Data), CR (Cognitive Radio), and NFV (Network Function Virtualization) techniques. CR and NFV support resource slicing in the spectrum, physical layer, and network layer, while WBD is used to design intelligent mechanisms for resource mapping and traffic prediction through powerful AI (Artificial Intelligence) methods. We analyze the characteristics of WBD and review possible AI methods to be utilized in smart networks. In particular, the potential of WBD is revealed through a high-level view of SRA, which intelligently maps radio and network resources to each network to meet dynamic traffic demand, as well as CUA, which allows mobile users to access the best available network at manageable cost while still achieving target QoS (Quality-of-Service) or QoE.
Applications of VANETs (Vehicular Ad hoc Networks) have their own requirements and challenges in wireless communication technology. Although regarded as the first standard for VANETs, IEEE 802.11p is still in the field-trial stage. Recently, LTE V2X (Long-Term Evolution Vehicle-to-Everything) appeared as a systematic V2X solution based on TD-LTE (Time Division Long-Term Evolution) 4G, and it is regarded as the most powerful competitor to 802.11p. We conduct link-level simulations of LTE V2X and DSRC (Dedicated Short-Range Communication) for several different types of scenarios. Simulation results show that LTE V2X can achieve the same BLER (Block Error Ratio) at a lower SNR (Signal-to-Noise Ratio) than DSRC. LTE V2X thus guarantees a more reliable link, achieving the same BLER at lower receiving power than DSRC, and its coverage area is larger than that of DSRC.
Chandran et al. introduced the direction of position-based cryptography at CRYPTO 2009. In position-based cryptography, the position of a party serves as its unique “credential” for realizing cryptographic tasks, such as position-based encryption, position-based signatures, position-based key exchange, and so on. Position-based key exchange, as a basic primitive in position-based cryptography, can be used to establish a shared key based on the position of a participant. To begin with, this paper presents the notions of the prover-to-verifier mode and the prover-to-prover mode for position-based key exchange. In the prover-to-verifier mode, a secret key can be shared between a prover and the verifiers according to the position of the prover. In the prover-to-prover mode, two provers located at valid positions can negotiate a shared key with the help of the verifiers, and any other party whose position is invalid cannot obtain the shared key. This paper also formalizes two security definitions against colluding adversaries: position-based prover-to-verifier key exchange and position-based prover-to-prover key exchange. It then introduces the bounded-retrieval model and implementations of position-based key exchange in the two modes based on this model. Finally, the paper discusses the position-based key exchange protocols in the two modes from both security and performance perspectives.
In vehicular networks, the exchange of beacons among neighboring vehicles is a promising solution for guaranteeing a vehicle's safety. However, frequent beaconing under high vehicle-density conditions causes beacon collisions, which are harmful to both driving safety and location-tracking accuracy. We propose an ABIwRC (Adaptive Beaconing Interval with Resource Coordination) method for a highway scenario. Each vehicle broadcasts beacon-interval requests, including the intervals needed for both the vehicle's driving safety and location-tracking accuracy. The RSU (Road Side Unit) allocates resources for a vehicle's beaconing according to the requests from all vehicles and the interference relationships between vehicles in adjacent RSUs. We formulate a resource allocation problem for maximizing the sum utility, which measures the satisfaction of vehicles' requests. We then transform the optimization problem into a maximum weighted independent set problem and propose an algorithm to solve it efficiently. Simulation results show that the proposed method outperforms the benchmark in terms of beacon reception ratio, vehicle driving safety, and location-tracking accuracy.
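The reduction to a maximum weighted independent set (MWIS) problem can be illustrated with a standard greedy approximation, where vertices are candidate beacon-interval assignments, edges mark interference, and weights are utilities. The graph, weights, and function names below are illustrative assumptions, not the paper's actual algorithm:

```python
# Greedy heuristic for maximum weighted independent set (MWIS).
# Vertices are candidate beacon assignments; an edge links two assignments
# whose transmissions would interfere; weights are the request utilities.

def greedy_mwis(weights, edges):
    """Pick a high-utility set of mutually non-interfering assignments."""
    neighbors = {v: set() for v in weights}
    for u, v in edges:
        neighbors[u].add(v)
        neighbors[v].add(u)
    chosen, excluded = set(), set()
    # Standard greedy rule: consider vertices by weight / (degree + 1).
    for v in sorted(weights, key=lambda v: weights[v] / (len(neighbors[v]) + 1),
                    reverse=True):
        if v not in excluded:
            chosen.add(v)
            excluded |= neighbors[v]  # interfering vertices are now barred
    return chosen, sum(weights[v] for v in chosen)

# Toy example: 4 vehicles; pairs (0,1) and (2,3) interfere with each other.
sel, util = greedy_mwis({0: 3.0, 1: 2.0, 2: 5.0, 3: 4.0}, [(0, 1), (2, 3)])
```

This greedy rule is a common polynomial-time surrogate for the NP-hard MWIS problem; the paper's own algorithm may differ.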
Cloud computing facilitates convenient and on-demand network access to a centralized pool of resources. Currently, many users prefer to outsource data to the cloud in order to mitigate the burden of local storage. However, storing sensitive data on remote servers poses privacy challenges and is currently a source of concern. SE (Searchable Encryption) is a positive way to protect users' sensitive data while preserving searchability on the server side. SE allows the server to search encrypted data without leaking information in plaintext. The two main branches of SE are SSE (Searchable Symmetric Encryption) and PEKS (Public key Encryption with Keyword Search). SSE allows only private-key holders to produce ciphertexts and create trapdoors for search, whereas PEKS enables any user who knows the public key to produce ciphertexts but allows only the private-key holder to create trapdoors. This article surveys the two main techniques of SE: SSE and PEKS. Different SE schemes are categorized and compared in terms of functionality, efficiency, and security. Moreover, we point out some valuable directions for future work on SE schemes.
In recent years, IoV (Internet of Vehicles) has become one of the most active research fields in networking and intelligent transportation systems. As an open converged network, IoV plays an important role in solving various driving and traffic problems through advanced information and communications technology. We review the existing notions of IoV from different perspectives. Then, we provide our own notion from a network point of view and propose a novel four-layer IoV architecture. In particular, a novel layer named the coordinative computing control layer is separated from the application layer; it is used for solving coordinative computing and control problems for the human-vehicle-environment system. After summarizing the key technologies in the IoV architecture, we construct a VV (Virtual Vehicle), which is an integrated image of the driver and vehicle in networks. VVs can interact with each other in cyberspace by providing traffic services and sharing sensing data coordinately, which can relieve the communication bottleneck in physical space. Finally, an extended IoV architecture based on VVs is proposed.
IoV (Internet of Vehicles) is a promising paradigm for the future of automobiles, which will undoubtedly boost the automobile market as well as accelerate innovation in Internet services and applications. The concept of SD-IoV (Software Defined IoV) is presented, which is capable of improving resource utilization, service quality, and network optimization in harsh vehicular network environments. First, a generalized SD-IoV architecture is presented as an intuitive big picture. Then, the major functions realized by SD-IoV are elaborated on to illustrate how the current challenges are resolved. As the key enablers of SD-IoV, three possible implementation methods of the wireless control path are described and compared. Finally, the challenges and existing solutions of SD-IoV are discussed and open issues are pointed out so as to shed light on future research.
The widespread application of heterogeneous cloud computing has enabled enormous advances in the real-time performance of telehealth systems. A cloud-based telehealth system allows healthcare users to obtain medical data from various data sources supported by heterogeneous cloud providers. Employing data duplication in distributed cloud databases is an alternative approach for achieving data sharing among multiple data users. However, this approach consumes additional storage space, while reducing data duplication degrades data acquisition and real-time performance. To address this issue, this paper focuses on developing a dynamic data deduplication method that uses an intelligent blocker to determine the working mode of data duplication for each data package in heterogeneous cloud-based telehealth systems. The proposed approach is named the SD2M (Smart Data Deduplication Model), in which the main algorithm applies dynamic programming to produce optimal solutions that minimize the total cost of data usage. We implement experimental evaluations to examine the adaptability of the proposed approach.
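The dynamic-programming core of such a duplicate-or-fetch decision can be sketched as a small knapsack-style recursion. The cost model, budget constraint, and names below are illustrative assumptions, not SD2M itself:

```python
# Knapsack-style dynamic program: choose which data packages to duplicate
# locally. Duplicating package i uses size[i] units of a storage budget
# but saves its remote acquisition cost acq[i]; packages left remote pay
# their acquisition cost. This is an illustrative stand-in for SD2M's DP.

def plan_duplication(size, acq, budget):
    """Return (minimal total acquisition cost, set of duplicated packages)."""
    saved = [0] * (budget + 1)           # best acquisition cost saved so far
    pick = [frozenset()] * (budget + 1)  # which packages achieve that saving
    for i, (s, a) in enumerate(zip(size, acq)):
        for b in range(budget, s - 1, -1):   # backwards: each package once
            if saved[b - s] + a > saved[b]:
                saved[b] = saved[b - s] + a
                pick[b] = pick[b - s] | {i}
    return sum(acq) - saved[budget], pick[budget]

# Toy example: three packages, storage budget of 5 units.
cost, duplicated = plan_duplication(size=[2, 3, 4], acq=[3, 4, 5], budget=5)
```

Here duplicating packages 0 and 1 exactly fills the budget and minimizes the residual acquisition cost; the real model would weigh storage cost against latency rather than use a hard budget.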
In the age of information explosion, big data has brought challenges but also great opportunities that support a wide range of applications for people in all walks of life. Faced with continuous and intense competition from OTT (Over-The-Top) service providers, traditional telecommunications service providers have been forced to undergo enterprise transformation. Fortunately, these providers have natural and unique advantages in terms of both data sources and data scale, which give them a competitive edge. Multiple foreign mainstream telecom operators have already applied big data to their own growth, from internal business to external applications. Armed with big data, domestic telecom companies are also innovating their business models. This paper introduces three aspects of big data in the telecommunications industry. First, the unique characteristics and advantages of communications-industry big data are discussed. Second, the development of the big data platform architecture, which incorporates five crucial sub-systems, is introduced in detail; we highlight the data collection and data processing systems. Finally, three internal or external application areas based on big data analysis are discussed, namely basic business, network construction, and intelligent tracing. Our work sheds light on how to handle big data for telecommunications enterprise development.
Abstract: With the support of the National Natural Science Foundation, the Academy of Space Electronic Information Technology is developing a novel compact spaceborne GNSS receiver, referred to as the HiSGR (High Sensitivity GNSS Receiver). This receiver can operate effectively over the full range of Earth-orbiting missions, from LEO (Low Earth Orbit) to geostationary orbit and beyond. Improved signal detection algorithms are used in the signal processing section of the HiSGR, and an inertial sensor is used in an ultra-tightly coupled GNSS/INS design, which makes the acquisition process fast and improves tracking performance for weak GPS signals under high dynamics. Extensive tests are performed using the HiSGR to demonstrate the good performance of several crucial specifications, by employing a real GNSS signal received in an open field and through hardware-in-the-loop simulation. Receiver performance is demonstrated for LEO and GEO scenarios. A ground vehicle running test is performed to demonstrate fast acquisition and reacquisition capabilities under conditions of signal loss. The HiSGR showed good performance and remained stable during the simulations and tests, which proves its suitability for future space applications.
The wide spectrum and over-the-air propagation characteristics give mmWave communication unique advantages as well as design challenges for 5G applications. To increase system speed, capacity, and coverage, innovation is needed in the RF system architecture, circuits, antennas, and packaging in terms of implementation opportunities and constraints. The mmWave spectrum characteristics, circuits, RF system architecture, and their implementation issues are discussed. Moreover, the transmitter and the other key components, i.e., the receiver, antenna, and packaging, are reviewed.
A fully integrated 60-GHz transceiver for 802.11ad applications, with superior performance in a 90-nm CMOS process versus prior arts, is proposed and realized based on a field-circuit co-design methodology. The reported transceiver monolithically integrates a receiver, transmitter, PLL (Phase-Locked Loop) synthesizer, and LO (Local Oscillator) path based on a sliding-IF architecture. The transceiver supports up to a 16QAM modulation scheme and a data rate of 6 Gbit/s per channel, with an EVM (Error Vector Magnitude) lower than −20 dB. The receiver path achieves a configurable conversion gain of 36~64 dB and a noise figure of 7.1 dB over 57~64 GHz, while consuming only 177 mW of power. The transmitter achieves a conversion gain of roughly 26 dB, with an output P1dB of 8 dBm and a saturated output power of over 10 dBm, consuming 252 mW of power from a 1.2-V supply. The LO path is composed of a 24-GHz PLL, a doubler, and a divider chain, as well as an LO distribution network. In closed-loop operation mode, the PLL exhibits an integrated phase error of 3.3° rms (from 100 kHz to 100 MHz) over the prescribed frequency bands, with a total power dissipation of only 26 mW. All measured results agree closely with the simulations.
Satellite networks have many advantages over traditional terrestrial networks. However, it is very difficult to design a satellite network with excellent performance. This paper briefly summarizes some existing satellite network routing technologies from the perspective of both single-layer and multilayer satellite constellations, and focuses on the main ideas, characteristics, and open problems of these routing technologies. For single-layer satellite networks, two routing strategies are discussed: the virtual node strategy and the virtual topology strategy. Moreover, considering the deficiencies of existing multilayer satellite network routing, we discuss the topic of invulnerability. Finally, the challenges and problems faced by satellite networks are analyzed and the trend of future development is predicted.
Recent rapid developments in 4G wireless communication have been motivated by breakthroughs in air-interface technology, exemplified by the replacement of WCDMA (Wideband Code Division Multiple Access) with OFDM (Orthogonal Frequency-Division Multiplexing). Although the protocol to adopt for 5G HF (High-Frequency) wireless communication (including such matters as waveform, network deployment, and frequency range) has been a controversial issue for a number of years, a common view is that there is a large gap between the rapidly increasing traffic-capacity requirements and the capabilities of current LTE (Long Term Evolution) networks in terms of spectral and power efficiency. A number of technical challenges need to be overcome in order to bridge this gap. In this paper, by briefly reviewing progress in HF technology, we summarize technical challenges ranging from propagation attenuation and the implementation of circuit devices to signal processing and the Ka-band, to offer feasible reflections on the forthcoming technological revolution.
Modern mobile devices provide a wide variety of services. Users are able to access these services for many sensitive tasks relating to their everyday lives (e.g., finance, home, or contacts). However, these services also expose new attack surfaces. Many efforts have been devoted to protecting mobile users from privacy leakage. In this work, we study state-of-the-art techniques for the detection of and protection against privacy leakage, and discuss the evolving trends of privacy research.
Cyber-physical systems are confronted with an ever-increasing number of security threats arising from the complicated interactions and fusion between cyberspace and physical space. Integrating security-related activities into the early phases of the development life cycle is a holistic and cost-effective solution for developing security-critical cyber-physical systems. These activities often incorporate security mechanisms from different realms. We present a fine-grained design-flow paradigm for security-critical and software-intensive cyber-physical systems. We provide a comprehensive survey of the domain-specific architectures, countermeasure techniques, and security standards involved in the development life cycle of security-critical cyber-physical systems, and adapt these elements to the newly designed flow paradigm. Finally, we offer perspectives and future directions for improving the usability and security level of this design-flow paradigm.
Abstract: Atmospheric ducts are horizontal layers that occur under certain weather conditions in the lower atmosphere. Radio signals guided in atmospheric ducts tend to experience less attenuation and spread much farther, i.e., hundreds of kilometers. In a large-scale deployed TD-LTE (Time Division Long Term Evolution) network, atmospheric ducts cause faraway downlink wireless signals to propagate beyond the designed protection distance and interfere with local uplink signals, resulting in a large outage probability. In this paper, we analyze the characteristics of ADI (Atmospheric Duct Interference) using real network-side big data from the currently operated TD-LTE network owned by China Mobile. The analysis reveals the time-varying and directional characteristics of ADI. In addition, we propose an SVM (Support Vector Machine)-classifier-based spatial prediction method for ADI, applying machine learning to a combination of real network-side big data and real meteorological data. Furthermore, an ADMM (Alternating Direction Method of Multipliers) framework is proposed to implement a distributed SVM prediction scheme, which reduces data exchange among different regions/cities while maintaining similar prediction accuracy, and is thus of more practical use to operators.
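The classification step can be sketched with the smallest runnable linear classifier. The paper trains an SVM (distributed via ADMM); here a classic perceptron stands in, and the two features (say, a humidity-gradient index and a temperature-inversion index) and all numbers are purely illustrative assumptions:

```python
# Minimal linear classifier sketch for predicting ADI presence from
# meteorological features. A perceptron is used as a stand-in for the
# paper's SVM; features, labels, and data are illustrative assumptions.

def train_perceptron(X, y, epochs=100):
    """Classic perceptron: converges on linearly separable data."""
    w, b = [0.0] * len(X[0]), 0.0
    for _ in range(epochs):
        errors = 0
        for xi, yi in zip(X, y):
            score = sum(wj * xj for wj, xj in zip(w, xi)) + b
            if yi * score <= 0:                      # misclassified: update
                w = [wj + yi * xj for wj, xj in zip(w, xi)]
                b += yi
                errors += 1
        if errors == 0:                              # perfectly separated
            break
    return w, b

def predict(w, b, x):
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0 else -1

# Toy separable data: strong duct conditions (large first feature) -> +1.
X = [[2.0, 0.5], [1.8, 0.3], [0.2, 0.4], [0.1, 0.6]]
y = [1, 1, -1, -1]
w, b = train_perceptron(X, y)
preds = [predict(w, b, xi) for xi in X]
```

A margin-maximizing SVM (and its ADMM-distributed training across regions) replaces this update rule with a regularized hinge-loss objective, but the predict step has the same linear form.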
To meet the ever-growing traffic in mobile communication, mmWave (millimeter-Wave) frequency bands have gained considerable attention for having a greater amount of bandwidth available than the current cellular spectrum below 3 GHz. Several test systems have recently been reported to validate the possibility of mmWave links in mobile scenarios. However, there still exist practical issues, including reliability and cost, that hinder the application of mmWave in mobile communication. In this article, we present some new designs that address these issues, covering system architecture, transceiver architecture, and related issues such as circuits and antenna arrays. A hyper-cellular architecture is applied in mmWave mobile networks to overcome blockage problems, and a Butler-matrix-based HBF (Hybrid Beamforming) architecture is considered for the mmWave link. Simulation and experimental results are presented to validate the effectiveness of the Butler-matrix-based system.
Abstract: Wireless channel modeling has always been one of the most fundamental topics in wireless communication research. The performance of new advanced models and technologies depends heavily on the accuracy of the wireless CSI (Channel State Information). This study examines the randomness of wireless channel parameters based on the characteristics of the radio propagation environment. The diversity of the statistical properties of wireless channel parameters inspired us to introduce the concept of the tomographic channel model. With this model, the static part of the CSI can be extracted from the huge amount of existing CSI data from previous measurements; we define this static part as the wireless channel feature. In the proposed scheme for obtaining CSI with the tomographic channel model, a GMM (Gaussian Mixture Model) is applied to acquire the distribution of the wireless channel parameters, and a CNN (Convolutional Neural Network) is applied to automatically distinguish different wireless channels. The wireless channel feature information can be stored offline to guide the design of pilot symbols and save pilot resources. Numerical results based on actual measurements demonstrate the clear diversity of the statistical properties of wireless channel parameters and show that the proposed scheme can extract the wireless channel feature automatically with fewer pilot resources. Thus, computing and storage resources can be exchanged for the finite and precious spectrum resource.
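As a sketch of the GMM step, a minimal one-dimensional, two-component mixture fitted by EM is shown below. The data, initialization, and dimensionality are illustrative assumptions; the paper's model fits measured multivariate channel parameters:

```python
import math

# Minimal 1-D, two-component GMM fitted by expectation-maximization, as a
# sketch of modeling a channel parameter distribution (e.g., path delays).
# Data and initialization are toy values, not measured CSI.

def fit_gmm(data, iters=50):
    mu = [min(data), max(data)]          # spread-apart initialization
    var = [1.0, 1.0]
    pi = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibility of each component for each sample.
        resp = []
        for x in data:
            p = [pi[k] / math.sqrt(2 * math.pi * var[k])
                 * math.exp(-(x - mu[k]) ** 2 / (2 * var[k])) for k in range(2)]
            s = sum(p)
            resp.append([pk / s for pk in p])
        # M-step: re-estimate weights, means, and variances.
        for k in range(2):
            nk = sum(r[k] for r in resp)
            pi[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = max(1e-6, sum(r[k] * (x - mu[k]) ** 2
                                   for r, x in zip(resp, data)) / nk)
    return pi, mu, var

# Two well-separated parameter clusters around 0 and 5.
data = [-0.2, 0.0, 0.1, 0.3, 4.8, 5.0, 5.1, 5.3]
pi, mu, var = fit_gmm(data)
```

On well-separated data like this, EM recovers the two cluster means and equal mixture weights; real channel data would need multivariate Gaussians and a model-selection step for the number of components.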
Cloud-based video communication and networking has emerged as a promising new research paradigm for significantly improving the quality of experience for video consumers. An architectural overview of this promising research area is presented. The overview starts with an end-to-end partition of the cloud-based video system into major blocks according to their locations, from the center of the cloud to its edge. Following this partition, existing research efforts on how the principles of cloud computing can provide unprecedented support to 1) video servers, 2) content delivery networks, and 3) edge networks within the global cloud video ecosystem are examined. Moreover, a case study on edge-cloud-assisted HTTP adaptive video streaming is presented to demonstrate the effectiveness of cloud computing support. Finally, the article concludes by envisioning a list of future research topics in cloud-based video communication and networking.
Abstract: Future communication systems will include different types of messages requiring different transmission rates, packet lengths, and service qualities. We address the power-optimization issues of communication systems conveying multiple message types based on finite-delay information theory. Given both the normalized transmission rate and the packet length of a system, the actual residual decoding error rate is a function of the transmission power. We propose a generalized power allocation framework for multiple message types. Two different optimization cost functions are adopted: the number of service-quality violations encountered, and the sum log-ratio of the residual decoding error rates. We provide the optimal analytical solution for the former cost function and a heuristic solution based on a genetic algorithm for the latter. Finally, the performance of the proposed solutions is evaluated numerically.
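The flavor of the genetic-algorithm heuristic can be sketched at toy scale. The exponential error model, the cost counted only over violated targets, and all GA operators and parameter values below are illustrative assumptions, not the paper's design:

```python
import math
import random

# Toy GA sketch: split a total power budget among message types so that the
# sum log-ratio of residual error rate to its QoS target (counted when the
# target is violated) is minimized. eps(p) = exp(-g * p) is a stand-in model.

def residual_error(p, g):
    return math.exp(-g * p)

def cost(alloc, gains, targets):
    return sum(max(0.0, math.log(residual_error(p, g) / t))
               for p, g, t in zip(alloc, gains, targets))

def ga_allocate(gains, targets, total_power, pop=30, gens=60, seed=1):
    rng = random.Random(seed)
    K = len(gains)

    def normalize(x):                      # keep the power budget exact
        s = sum(x)
        return [total_power * v / s for v in x]

    population = [normalize([rng.random() for _ in range(K)])
                  for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=lambda a: cost(a, gains, targets))
        parents = population[:pop // 2]    # elitist selection
        children = []
        while len(children) < pop - len(parents):
            a, b = rng.sample(parents, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]      # crossover
            i = rng.randrange(K)
            child[i] *= 1 + rng.uniform(-0.1, 0.1)           # mutation
            children.append(normalize(child))
        population = parents + children
    return min(population, key=lambda a: cost(a, gains, targets))

best = ga_allocate(gains=[1.0, 2.0], targets=[1e-2, 1e-3], total_power=12.0)
```

Because the parents carry over unchanged, the best allocation never worsens across generations; with an ample budget the GA quickly lands in the zero-violation region.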
Mobile devices such as smartphones and tablets have continued to grow in popularity in recent years. Nowadays, people rely on these ubiquitous smart devices and carry them everywhere in their daily lives. The acoustic signal, as a simple and prevalent transmission vector for end-to-end communication, shows unique characteristics compared with another popular communication method, the optical signal, especially in applications running on smart devices. Acoustic signals do not require line-of-sight during transmission, and the computational power of most smart devices is sufficient to modulate/demodulate acoustic signals using a software acoustic modem alone, which can easily be deployed on current off-the-shelf smart devices. Therefore, many acoustics-based short-range communication systems have been developed and are used in sensitive applications such as building access control and mobile payment systems. However, past work shows that an acoustic eavesdropper snooping on the communication between a transmitter and its legitimate receiver can easily break their communication protocol and decode the transmitted information. To solve this problem, many solutions have been proposed to protect the acoustic signal against eavesdroppers. In this overview, we explore the designs of existing solutions, their implementations, and their methodologies for protecting acoustic signal communication. For each dependable and secure acoustics-based short-range communication system, we present the major technical hurdles to be overcome and the state of the art, and also offer a vision of future research issues for this promising technology.
The large bandwidth available at mmWave (millimeter Wave) frequencies makes them a promising candidate for 5th-generation cellular networks. Proper channel estimation algorithms must be developed to enable beamforming in mmWave systems. In this paper, we propose an adaptive channel estimation algorithm that exploits the poor scattering nature of the mmWave channel and adjusts the training overhead adaptively as the channel quality changes in mmWave cellular systems. First, we use a short training sequence to estimate the channel parameters based on the two-dimensional discrete Fourier transform method. Then, we design a feedback scheme to adjust the length of the training sequence under the premise of ensuring the accuracy of the channel estimation. The key threshold in the feedback scheme is derived and its influence on the accuracy of the estimation results is analyzed. Simulation results confirm that the proposed algorithm can adjust the length of the training sequence adaptively according to the current channel condition while maintaining stable estimation accuracy.
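The feedback idea can be sketched as a simple threshold rule: lengthen training when the estimation error exceeds a threshold, shorten it when the error is comfortably below. The doubling/halving rule, the constants, and the 1/length error model below are illustrative assumptions, not the threshold derived in the paper:

```python
# Sketch of threshold-based feedback for adaptive training length.
# error > threshold       -> double the training sequence (up to l_max)
# error < threshold / 4   -> halve it (down to l_min)
# All constants and the error model are illustrative assumptions.

def adapt_training_length(length, error, threshold, l_min=8, l_max=128):
    """Adjust the training length based on the last estimation error."""
    if error > threshold:
        length = min(l_max, length * 2)
    elif error < threshold / 4:
        length = max(l_min, length // 2)
    return length

# Toy run: error shrinks roughly as scale / length; the channel worsens
# (scale 4.0) then improves (scale 0.5).
length, hist = 16, []
for scale in [4.0, 4.0, 4.0, 0.5, 0.5]:
    error = scale / length
    length = adapt_training_length(length, error, threshold=0.1)
    hist.append(length)
```

The dead band between threshold/4 and threshold keeps the length stable once the accuracy target is met, mirroring the stability the paper reports.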
Test results are reported for a 10-Gbps prototype demonstrator working in the 71~76 GHz frequency band with a 2-bit/s/Hz spectral efficiency. To overcome the speed limitation of commercial DA/AD converters, a two-channel analog IF multiplexing and demultiplexing topology is adopted as a trade-off between cost and spectral efficiency. The same approach is also used to achieve up to 20 Gbps over the full 10-GHz bandwidth of the allocated commercial bands (71~76 GHz and 81~86 GHz).
Abstract: Wireless big data is attracting extensive attention from operators, vendors, and academia, as it provides new freedom in improving performance at various levels of wireless networks. One possible way to leverage big data analysis is predictive resource allocation, which has been reported to increase spectrum and energy resource utilization efficiency using predicted user behavior, including user mobility. However, few works address how traffic load prediction can be exploited to optimize data-driven radio access. We show how to translate the predicted traffic load into the essential information used for resource optimization, taking energy-saving transmission for non-real-time users as an example. By formulating and solving an energy-minimizing resource allocation problem with future instantaneous bandwidth information, we not only provide a performance upper bound but also reveal that only two key parameters are related to the future information. By exploiting the residual bandwidth probability derived from the traffic volume prediction, the two parameters can be estimated accurately when the transmission delay allowed by the user is large, and a closed-form solution of the globally optimal resource allocation can be obtained when the delay approaches infinity. We also provide a heuristic resource allocation policy that guarantees a target transmission completion probability when the delay is not so large. Simulation results validate our analysis, show the remarkable energy-saving gain of the proposed predictive policy over non-predictive policies, and illustrate that the time granularity in predicting the traffic load should be identical to the delay allowed by the user.
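The benefit of knowing future bandwidth can be illustrated with a toy convex model, which is an illustrative assumption and not the paper's formulation: if sending r bits in a slot with residual bandwidth W costs energy W*(2^(r/W) - 1), the first-order conditions equalize r/W across slots, so the optimum splits a delay-tolerant transfer in proportion to the predicted residual bandwidth.

```python
# Sketch of predictive energy-saving transmission. Per-slot energy model
# W * (2**(r / W) - 1) (Shannon-style, illustrative); the optimum for a
# fixed total demand loads each slot in proportion to its bandwidth.

def allocate_bits(demand, bandwidth):
    """Proportional split, optimal for the convex model above."""
    total_w = sum(bandwidth)
    return [demand * w / total_w for w in bandwidth]

def energy(rates, bandwidth):
    return sum(w * (2 ** (r / w) - 1) for r, w in zip(rates, bandwidth))

# Predicted residual bandwidth over three future slots before the deadline.
pred_bw = [10.0, 30.0, 20.0]
rates = allocate_bits(60.0, pred_bw)
e_pred = energy(rates, pred_bw)
e_naive = energy([60.0, 0.0, 0.0], pred_bw)  # send everything immediately
```

In this toy case the predictive split spends an order of magnitude less energy than transmitting everything in the first slot, which is the qualitative gain the abstract attributes to prediction.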
Vehicular network communication technology is currently attracting a considerable amount of attention. We consider a scenario in which vehicular communication nodes share the same spectrum resources and generate interference with one another. In contrast to traditional interference-avoiding vehicular communications, this paper aims to increase the number of admitted communication links under the premise of satisfying the required QoS. In our approach, communication nodes have opportunities to select relay nodes both to help improve their data transmissions and to reduce their transmit power, in order to decrease interference with other links while still satisfying their QoS requirements. Based on these objectives, we propose an innovative interference management method that considers link selection, power adaptation, and communication mode selection simultaneously to maximize the number of communication links at the lowest power cost. Compared with traditional link-selection and power-adaptation interference management schemes, the proposed scheme improves QoS satisfaction with high energy efficiency. Simulation results demonstrate both the efficiency and the effectiveness of the proposed scheme.