Tutorial Schedule


Morning Tutorials: All are Half-Day

1. Cloud RAN Overview
2. Towards 5G: LTE, LTE-Advanced, and Beyond
3. Signal Processing for Millimeter Wave Wireless Communications
4. A Hands-on Guide to US Spectrum Policy and Regulations for Wireless Innovators
5. IEEE 802.11ax: High Efficiency Wireless LAN (HEW)
6. Molecular Communication: System Models, Fundamental Limits, and Experimental Implementations
7. Hacking Network Coding for 5G Systems

Afternoon Tutorials:

8. Massive MIMO: Theory and Practice
9. Network Slicing Solutions: Analysis, Design and Optimization in 5G Wireless Networks
10. 5G Wireless Systems in Unlicensed Spectrum
11. Design and Test Challenges at Microwave and Millimeter Wave Frequencies

Cloud RAN Overview

Presenters: Rajeev Agrawal, Anand Bedekar, and Suresh Kalyanasundaram (Nokia, USA)
The Cloud Radio Access Network (Cloud RAN) is an emerging architectural paradigm in mobile networks for both 4G and 5G. In conventional mobile networks, RAN baseband functionality is typically deployed at the cell sites on special-purpose hardware. In the Cloud RAN architecture, all or part of the baseband functions are moved deeper into the network, to more centralized locations, and hosted on general-purpose server hardware using virtualization technologies. Cloud RAN aims to achieve operational efficiencies and a better total cost of ownership through centralization of baseband functions, pooling efficiencies for RAN baseband processing, and air-interface performance gains from fast-time-scale multi-cell coordination. In this tutorial, we present an overall look at deployment scenarios, potential benefits, and key technical challenges in the evolution of the Radio Access Network (RAN) architecture towards Cloud RAN, along with solutions to overcome them. We aim to convey the insights and architectural principles underlying the key technologies and tradeoffs that drive Cloud RAN.
A key challenge in deploying Cloud RAN is the need for high-bandwidth, low-latency transport between the central sites and cell sites, known as fronthaul. To address fronthaul limitations, we examine the implications and tradeoffs of RAN functional splits on fronthaul needs, system performance, and centralization scale. We present an analysis of the impact of Cloud RAN architectures and fronthaul on the performance gains achievable by multi-cell coordination, and of the implications of the architecture of multi-cell coordination algorithms on deployment flexibility in a Cloud RAN environment. To maximize the use of general-purpose processors (GPPs) and operating systems such as Linux for Cloud RAN, we examine the implications of the need to achieve real-time performance for RAN functions. To enable right-sizing the amount of compute used for various RAN functions based on the workload, we examine the principles underlying pooling and scalability for RAN functions. Cloud RAN also aims to use cloud management technologies such as virtualized infrastructure management (VIM) and orchestration for automating the instantiation and scaling of RAN functions; we look at the special needs of the RAN arising from real-time constraints and a mix of GPP and non-GPP hardware. In the evolution towards 5G, we propose the use of Cloud-RAN-based multi-connectivity anchoring to address processing bottlenecks in a scalable manner. The emergence of the Distributed Edge Cloud that hosts the Cloud RAN also enables a broader architectural examination of which functions may benefit from being closer to the network edge, and we identify opportunities for optimization across the RAN and other network layers enabled by the Distributed Edge Cloud architecture.
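The fronthaul pressure described above can be made concrete with a back-of-the-envelope calculation. The sketch below is our own illustration, not tutorial material: it estimates the CPRI-style bit rate needed to carry raw time-domain I/Q samples for a fully centralized split, with sampling rate, sample width, and overhead factors assumed to be typical of a 20 MHz LTE carrier.

```python
# Illustrative fronthaul estimate (assumed parameters, not from the tutorial):
# a fully centralized split ships time-domain I/Q samples over a CPRI-style link.
def iq_fronthaul_rate_bps(sample_rate_hz=30.72e6,      # LTE 20 MHz sampling rate
                          bits_per_sample=15,          # resolution of I (and Q)
                          antennas=2,
                          line_coding_overhead=10 / 8,  # 8b/10b line coding
                          control_overhead=16 / 15):    # CPRI control words
    payload = sample_rate_hz * 2 * bits_per_sample * antennas  # 2 = I and Q
    return payload * line_coding_overhead * control_overhead

rate = iq_fronthaul_rate_bps()
print(f"{rate / 1e9:.3f} Gbps")  # ~2.458 Gbps for one 2-antenna 20 MHz cell
```

Even this single small cell needs roughly 2.5 Gbps of fronthaul, and the rate scales linearly with antennas and bandwidth, which is why alternative functional splits matter.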

Towards 5G: LTE, LTE-Advanced, and Beyond

Presenter: Hyung Myung, Qualcomm
Long Term Evolution (LTE), developed by 3GPP, has become the global fourth-generation (4G) standard, and 3GPP recently started to investigate the 5G standard in Release 14. In this tutorial, we first survey the techniques underlying 4G and 5G, such as OFDMA, SC-FDMA, MIMO/Massive MIMO, fast multi-carrier resource scheduling, and millimeter wave (mmWave) radio access. We then give a technical overview of LTE and LTE-Advanced, and survey the upcoming 5G system design and the timeline of 5G standardization within 3GPP. The audience will learn about the key technologies of 4G and candidate 5G communication systems and will obtain a detailed understanding of LTE, LTE-Advanced, and candidate 5G systems.
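As a taste of the waveform material, the sketch below (our own illustration, not the presenter's) shows numerically why LTE adopted SC-FDMA for the uplink: DFT-spreading the symbols before the IFFT gives a noticeably lower peak-to-average power ratio (PAPR) than plain OFDMA, easing handset power amplifier requirements. Subcarrier counts and trial count are arbitrary choices.

```python
import numpy as np

# SC-FDMA is DFT-spread OFDM: a DFT precoder in front of the subcarrier
# mapping and IFFT. Compare the average PAPR against plain OFDMA.
rng = np.random.default_rng(0)
n_sc, n_fft, trials = 64, 256, 200   # occupied subcarriers, IFFT size, trials

def papr_db(x):
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

def avg_papr(dft_precode):
    vals = []
    for _ in range(trials):
        s = (rng.choice([-1, 1], n_sc) + 1j * rng.choice([-1, 1], n_sc)) / np.sqrt(2)
        freq = np.zeros(n_fft, complex)
        freq[:n_sc] = np.fft.fft(s) if dft_precode else s   # localized mapping
        vals.append(papr_db(np.fft.ifft(freq)))
    return float(np.mean(vals))

ofdma, scfdma = avg_papr(False), avg_papr(True)
print(f"average PAPR  OFDMA: {ofdma:.1f} dB   SC-FDMA: {scfdma:.1f} dB")
```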

Signal Processing for Millimeter Wave Wireless Communications     

Presenters: Robert Heath, University of Texas, Austin, USA and Nuria González-Prelcic, Universidad de Vigo, Spain

Communication at millimeter wave (mmWave) frequencies is defining a new era of wireless communication. The mmWave band offers much higher bandwidth communication channels than presently used in commercial wireless systems. Wireless local area networks are already exploiting the 60 GHz mmWave band, while 5G cellular systems are likely to operate at other mmWave frequencies. Because of the large antenna arrays, different channel models, and new hardware constraints, signal processing is different in mmWave communication systems. This tutorial will provide an overview of mmWave wireless communication from a signal processing perspective. Topics covered include propagation models and the presence of sparsity in the channel, power consumption and resulting hardware constraints, MIMO techniques in mmWave including beam training, hybrid beamforming, MIMO with low-resolution analog-to-digital converters, and channel estimation. Millimeter wave communication is a topic of extreme interest right now in the signal processing and communication theory communities. We also note it is a significant area of interest for the US Government, with the FCC just releasing a notice of inquiry for using mmWave spectrum for mobile communication and suggesting potential spectrum. This tutorial opens the door to future applications of mmWave to cellular, transportation, massive MIMO, and wearables, reviewing as well current applications in WLAN. We believe that our tutorial is very timely given the growing interest in mmWave for cellular communication in particular.
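The role of large arrays can be previewed with a small numerical sketch (ours, under the assumption of a single-path channel and a half-wavelength uniform linear array): a phase-only analog beamformer matched to the dominant path recovers an array gain of 10*log10(N) dB, which is what offsets the severe mmWave path loss.

```python
import numpy as np

# Matched analog beamforming on a half-wavelength ULA (illustrative sketch).
def steering_vector(n, theta):
    """Unit-norm ULA response for angle theta (radians), lambda/2 spacing."""
    return np.exp(1j * np.pi * np.arange(n) * np.sin(theta)) / np.sqrt(n)

n_ant, theta = 64, np.deg2rad(20)
h = steering_vector(n_ant, theta) * np.sqrt(n_ant)   # single-path channel
w = steering_vector(n_ant, theta)                    # phase-only beamformer
gain_db = 20 * np.log10(abs(np.vdot(w, h)))
print(f"beamforming gain: {gain_db:.1f} dB")  # 10*log10(64) ≈ 18.1 dB
```

With 64 antennas the ~18 dB of gain is exactly the kind of margin mmWave links need; how to find the right steering direction efficiently (beam training, compressed channel estimation) is what the tutorial covers.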

A Hands-on Guide to US Spectrum Policy and Regulations for Wireless Innovators

Presenters: Michael Marcus, FCC, USA (retired) and Virginia Tech, USA, Anne Linton Cortez, Washington Federal Strategies, USA
Around the world, spectrum technologies are regulated much more heavily than most other technologies in the IEEE community. In particular, innovative technologies often need non-routine regulatory approvals, and ignoring those approvals can severely delay or even block market access. Such technologies might involve new bands or novel ways of sharing spectrum on a non-interfering basis with existing users. This tutorial will explain the basics of international and US spectrum policy so that innovators can identify serious regulatory issues early. It will also explain the various routine and non-routine approvals that might be needed, such as experimental licenses, equipment authorization, waivers, service rules, and commenting on FCC proposals. Possible tactics for influencing policy will be discussed. Bands discussed will range from VHF to the WRC-19 proposal that goes up to 450 GHz.

IEEE 802.11ax:  High Efficiency Wireless LAN (HEW)

Presenters: Osama Aboul-Magd, Huawei Technologies, Canada, and Chair of IEEE 802.11ax, Edward Au, Huawei Technologies, Canada, and Chair of IEEE 802.11ay

In recent years there has been an increased dependence on Wi-Fi technology as the main tool for accessing the Internet. Several factors have contributed to this trend: in addition to the ubiquitous availability of Wi-Fi interfaces on mobile devices and the ease of use of the technology, the most prominent factor is the almost free availability of Wi-Fi connectivity in coffee shops, hotels, convention centers, and so on. The increased use of Wi-Fi technology has manifested itself in a phenomenal increase in traffic crossing Wi-Fi facilities, driven mainly by growth in video traffic. Further, the traditional environments (use cases) where Wi-Fi is deployed have also changed. WLAN deployments have migrated from their traditional markets in enterprise and consumer electronics to carrier and service provider deployments for data offloading, and to deployments characterized by large numbers of users and devices (access points) in a closed and limited geographical area, such as airports and sports events in public stadiums, i.e., dense deployments.
To meet these new challenges, a further increase in the supported data rates may be difficult to achieve due to technology limitations and may not be very helpful. In 2013 the IEEE 802.11 Working Group embarked on a new project to improve Wi-Fi users' experience and deal with dense deployment scenarios. The project, named High Efficiency WLAN (HEW), is also known as IEEE 802.11ax. Its scope deviates from that of previous projects, e.g., IEEE 802.11n and IEEE 802.11ac, in that it focuses on improving per-user throughput rather than aggregate link throughput.
This tutorial provides an overview of the work progressing within IEEE 802.11 on the high efficiency WLAN (HEW), or IEEE 802.11ax, amendment. IEEE 802.11ax is the next in the Wi-Fi standard series after the successful deployments of IEEE 802.11n and IEEE 802.11ac. It is expected to introduce new features to the Wi-Fi industry, such as OFDMA and uplink multi-user MIMO (UL MU-MIMO). In particular, a new OFDMA PHY layer is introduced together with the supporting MAC features.
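The OFDMA feature can be illustrated with a small sketch (ours; the resource-unit sizes follow the 802.11ax draft structure, while the helper function and its wording are our own): the channel is divided into resource units (RUs), so an access point can schedule one wide RU for a single user or many narrow RUs for several users in the same transmission.

```python
# Illustrative 802.11ax OFDMA resource-unit plan (sketch, not tutorial text).
RU_PLAN = {  # channel width (MHz) -> (full-band RU tones, max 26-tone RUs)
    20: (242, 9),
    40: (484, 18),
    80: (996, 37),
}

def describe(width_mhz, users):
    """Hypothetical helper: pick a full-band RU or split into 26-tone RUs."""
    full, small = RU_PLAN[width_mhz]
    if users == 1:
        return f"{width_mhz} MHz: one {full}-tone RU for a single user"
    return f"{width_mhz} MHz: up to {min(users, small)} users on 26-tone RUs"

print(describe(20, 1))
print(describe(20, 4))
```

This per-user scheduling of narrow RUs is precisely how 802.11ax targets per-user throughput in dense deployments rather than raw aggregate rate.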

Molecular Communication: System Models, Fundamental Limits, and Experimental Implementations

Presenters: Nariman Farsad, Stanford Univ., and Chris Rose, Brown Univ.

This tutorial introduces the emerging field of molecular communication, wherein chemical signals are used to connect "tiny" machines such as synthetic biological devices and swarms of micro-scale robots. We begin by presenting some of the recent advances in systems biology, nanotechnology, and bioengineering which have led to the creation of many different tiny machines in a laboratory setting. Such devices could find application in in-body communication, data storage, and infrastructure monitoring in smart cities/industrial complexes and sensor networks for homeland security. Practical deployment of these devices is only possible if they can communicate and collaborate, but the medium at these size scales is often hostile to more standard electromagnetic and acoustic forms of communication; molecular communication is thus proposed as an attractive solution. Next, we discuss some of the different molecular communication system models developed over the past decade, all of which have three basic components: the transmitter, the propagation channel, and the receiver. We start from the transmitter and present different schemes by which information can be delivered by chemical signals. Then, different propagation mechanisms such as flow, active transport, and various forms of random walks are presented. Receiver models, such as ligand receptors, are introduced and optimal detection algorithms are discussed. We then consider fundamental capacity limits of timing-, concentration-, and payload-encoded molecular channels. The tutorial concludes with a discussion of recent experimental implementations of molecular communication and some of the most important open problems in this exciting new area.
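The random-walk propagation channel mentioned above is easy to visualize numerically. The sketch below is our own illustration with arbitrary parameters: molecules undergo 1-D drift-diffusion toward an absorbing receiver, and in a timing channel the information-bearing quantity is the random first-arrival time (whose mean, with drift velocity v, is d/v).

```python
import numpy as np

# Drift-diffusion first-arrival simulation (illustrative parameter values).
rng = np.random.default_rng(1)
d, v, D, dt = 1.0, 1.0, 0.5, 1e-3   # distance, drift, diffusion coeff., step (s)

def arrival_times(n_mol, max_steps=10_000):
    """Simulate n_mol 1-D walks; return each molecule's first-arrival time (s)."""
    steps = v * dt + rng.normal(0.0, np.sqrt(2 * D * dt), size=(n_mol, max_steps))
    paths = np.cumsum(steps, axis=1)
    hit = paths >= d
    first = hit.argmax(axis=1).astype(float)
    first[~hit.any(axis=1)] = np.nan            # never arrived in the window
    return (first + 1) * dt

times = arrival_times(300)
print(f"mean first-arrival ≈ {np.nanmean(times):.2f} s (theory d/v = {d / v:.2f} s)")
```

The heavy spread of these arrival times around the mean is exactly the noise that limits the capacity of molecular timing channels.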

Hacking Network Coding for 5G Systems

Presenters: Frank Fitzek, Sreekrishna Pandi, Juan Cabrera (Technical University of Dresden)

5G communication systems are just around the corner, but the new technical requirements in latency, throughput, security, and resilience, together with new architectures such as multi-path, mesh, or multi-hop, call for new technologies. One of these is Network Coding, which has attracted a lot of interest in the research community lately; first attempts are under way in standardization bodies to integrate this groundbreaking technology into commercial products. This tutorial will give a short introduction to network coding with respect to 5G, but the main focus is to enable the audience to implement their own ideas, either in simulations or in real testbeds. To that end, the tutorial organizers will present their own software library for network coding. The library comes with a small simulation environment for testing simple relaying topologies, and the tutorial will show how to embed it and how to parameterize it for different scenarios. Understanding the impact of different parameter choices is of critical importance for successfully deploying network coding in real networks and on real devices. Throughout the tutorial, participants will gain hands-on experience with the impact of key parameters such as finite field size, generation size, and systematic coding. The tutorial will also show how to implement the software on commercial platforms, and demonstrators will be available showing the full potential of network coding in larger testbeds.
The goal of the tutorial is for each participant to understand the basic functionality of network coding and to be able to integrate network coding into their own projects.

Massive MIMO: Theory and Practice

Presenters: Thomas L. Marzetta, Nokia, and Ove Edfors, Lund University
Massive MIMO is emerging as the most compelling fifth-generation wireless technology. Perhaps the ultimate embodiment of Multiple-Input Multiple-Output communications, Massive MIMO utilizes a large number of individually controlled, physically small, low-power antennas to create parallel virtual circuits over the full spectrum between the base station and a multiplicity of single-antenna users. Area spectral efficiency (bits/second/Hertz/square-kilometer) improvements over 4G technologies may range from ten to one thousand, depending on the mobility of the terminals. Other benefits include energy efficiency (bits/Joule) gains in excess of one thousand, and simple and effective power control that yields uniformly good service throughout the cell. Crucial to the scalability of Massive MIMO is its reliance on directly measured, rather than assumed, channel characteristics. The large number of service antennas, and the resulting channel hardening, makes the analysis and control of multi-cellular Massive MIMO systems surprisingly straightforward. Tractable capacity lower bounds account for receiver noise, channel estimation error, the overhead associated with pilot signals, power control, imperfections of the multiplexing and de-multiplexing signal processing, non-coherent inter-cell interference, and coherent inter-cell interference due to pilot contamination. In parallel with theoretical developments, experiments have validated propagation models that are favorable to the operation of Massive MIMO, and Massive MIMO test-beds are demonstrating the fundamental soundness of the concept. This tutorial provides participants with a thorough comprehension of the fundamentals of Massive MIMO, as well as an understanding of how practical Massive MIMO systems function. In addition, participants will learn to discern the distinctions between a genuine Massive MIMO system and MIMO systems that merely purport to be Massive MIMO.
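The channel hardening mentioned above can be demonstrated in a few lines. This sketch (ours, assuming i.i.d. Rayleigh fading) shows that the normalized channel gain ||h||^2 / M concentrates around its mean as the number of base station antennas M grows, so small-scale fading effectively vanishes from the link.

```python
import numpy as np

# Channel hardening under i.i.d. Rayleigh fading (illustrative sketch).
rng = np.random.default_rng(3)

def normalized_gain_std(m, trials=2000):
    """Std of ||h||^2 / M over Monte Carlo draws; the mean is 1 for every M."""
    h = (rng.normal(size=(trials, m)) + 1j * rng.normal(size=(trials, m))) / np.sqrt(2)
    g = np.sum(np.abs(h) ** 2, axis=1) / m
    return float(g.std())

for m in (1, 10, 100):
    print(f"M={m:4d}  std of ||h||^2/M ≈ {normalized_gain_std(m):.3f}")
```

The standard deviation shrinks roughly as 1/sqrt(M), which is what makes multi-cell analysis and power control so tractable at large M.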

Network Slicing Solutions: Analysis, Design and Optimization in 5G Wireless Networks

Presenters: Marco Di Renzo, Paris-Saclay University, Konstantinos Samdanis, Huawei European Research Center, Vincenzo Sciancalepore, NEC Europe Ltd., Fabrizio Granelli, University of Trento

The goal of this tutorial is to provide a comprehensive overview of network virtualization and network slicing operations through the several standards-definition activities carried out in the last decade. The tutorial sheds light on the feasibility of network slicing in next-generation mobile networks by breaking down the overhead and complexity of fully virtualized network deployments. It analyzes state-of-the-art solutions delivering the first examples of network slicing, while highlighting the hardware limitations of current solutions and the real potential of advanced virtualization approaches. The tutorial also provides the audience with a solid background in, and comprehensive description of, stochastic geometry modeling, by introducing key theorems, explaining how to formulate problems from the standpoint of system-level analysis and optimization, and illustrating how to use stochastic geometry for modeling and analyzing cellular networks based on the novel concept of multi-tenancy network sharing. Finally, the tutorial points out future research directions for embracing new open-source function/resource allocation procedures in a multi-tenant virtualized network scenario.
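The stochastic-geometry part can be previewed with a minimal Monte Carlo sketch (ours; the density, path-loss exponent, and SINR threshold are arbitrary assumptions): base stations are dropped as a 2-D Poisson point process, the typical user at the origin associates with the nearest one, and coverage probability is the fraction of drops whose SINR exceeds a threshold.

```python
import numpy as np

# PPP downlink coverage by Monte Carlo (illustrative sketch).
rng = np.random.default_rng(5)
lam, alpha, thresh_db, half = 1.0, 4.0, 0.0, 15.0   # density, exponent, 0 dB, window

def coverage_probability(trials=400):
    covered = 0
    for _ in range(trials):
        n = rng.poisson(lam * (2 * half) ** 2)
        xy = rng.uniform(-half, half, (n, 2))
        dist = np.sort(np.hypot(xy[:, 0], xy[:, 1]))
        fading = rng.exponential(size=len(dist))    # Rayleigh power fading
        rx = fading * dist ** -alpha
        sinr = rx[0] / (rx[1:].sum() + 1e-9)        # interference-limited
        covered += sinr > 10 ** (thresh_db / 10)
    return covered / trials

print(f"coverage P(SINR > 0 dB) ≈ {coverage_probability():.2f}")
```

For these parameters the classic closed-form PPP result predicts coverage near 0.56, and the simulation lands close to it; the tutorial extends such models to multi-tenant network sharing.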

5G Wireless Systems in Unlicensed Spectrum: Design Principles and Challenges

Presenter: Amitav Mukherjee, Ericsson Research, USA
Upcoming 5G wireless systems are being designed to operate across a vast swath of frequency bands, spanning licensed, shared, and unlicensed spectrum. Operation in unlicensed and shared spectrum creates considerable challenges due to uncertainty in channel access and coexistence with other technologies, which give rise to new research opportunities. This can be seen from the intense scrutiny of 5 GHz unlicensed-band technologies such as LTE-U and Licensed-Assisted Access (LAA) that need to coexist with Wi-Fi, for example. 5G systems will take this one step further by operating in unlicensed spectrum ranging from sub-1 GHz bands to millimeter-wave bands above 60 GHz. This raises a multitude of questions: How should unlicensed-band 5G IoT systems be designed for wide-area coverage? What kind of multi-antenna beamforming strategies are suitable for mmWave unlicensed spectrum? How will 5G coexist with other radio access technologies in unlicensed spectrum?
To answer these questions, this tutorial aims to provide a comprehensive overview of the state of the art in 5G wireless system design in unlicensed spectrum, including both broadband and IoT networks. We will visit the pertinent regulatory requirements, research challenges, a wide array of coexistence evaluations, ongoing standardization and implementation efforts, and applications of enabling 5G technologies in unlicensed spectrum, with an emphasis on PHY/MAC design aspects.
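The channel-access uncertainty at the heart of the coexistence question can be sketched in a few lines. The following is our own loose model of a Category-4-style listen-before-talk procedure as used by LAA; the slot length, contention window, and channel load are illustrative assumptions, not standard-accurate values.

```python
import random

# Listen-before-talk sketch: sense each slot, count down a random backoff,
# and freeze the counter whenever the channel is sensed busy (our model).
def lbt_attempt(channel_busy, cw_min=15, slot_us=9):
    """Return the total deferral time (us) before transmission may start.

    channel_busy: callable () -> bool giving the sensed state each slot.
    """
    backoff = random.randint(0, cw_min)
    elapsed = 0
    while backoff > 0:
        elapsed += slot_us
        if not channel_busy():      # idle slot: count down
            backoff -= 1
        # busy slot: freeze the counter and keep waiting
    return elapsed

random.seed(0)
# Hypothetical 30%-loaded channel
wait = lbt_attempt(lambda: random.random() < 0.3)
print(f"deferred {wait} us before transmitting")
```

The random, load-dependent deferral is exactly what makes latency and scheduling in unlicensed spectrum hard to guarantee.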

Design and Test Challenges at Microwave and Millimeter Wave Frequencies

Presenters: Sang-kyo Shin, Keysight Technologies, Khouzema Unchwaniwala, Keysight Technologies, Greg Jue, Keysight Technologies
One of the most promising emerging wireless technologies for fifth-generation (5G) cellular is the use of large blocks of contiguous spectrum in the microwave and millimeter-wave (mmWave) frequency bands. In the US, the FCC recently opened 3.8 GHz of licensed spectrum and 14 GHz of contiguous unlicensed spectrum, creating vast new possibilities for 5G applications using wide-bandwidth digital modulation. At the same time, the understanding of broadband signal impairments, device packaging, and antenna integration at mmWave frequencies is still in its infancy. Issues such as broadband noise, phase noise, linearity, and frequency response limit the attainable EVM and link budgets, and shorter wavelengths require much tighter mechanical tolerances. A different approach to generating and analyzing signals (> 2 GHz bandwidth) will be required to meet the very wideband requirements allowed by these new spectrum allocations. Simulation and modeling also play a bigger role in investigating these new technologies and in providing new design and test methodologies for system and circuit designs. All of these challenging issues will be explored in this tutorial, along with new design and test approaches to address them, in order to explore the vast possibilities of the mmWave frequency frontier.
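The EVM figure of merit mentioned above has a compact definition, sketched here with our own arbitrary noise level: EVM is the RMS error-vector power relative to the RMS power of the ideal reference constellation.

```python
import numpy as np

# EVM of a noisy QPSK constellation (illustrative sketch; noise level assumed).
rng = np.random.default_rng(9)

def evm_percent(ref, meas):
    """RMS error vector relative to RMS reference, in percent."""
    err = meas - ref
    return 100 * np.sqrt(np.mean(np.abs(err) ** 2) / np.mean(np.abs(ref) ** 2))

qpsk = (rng.choice([-1, 1], 1000) + 1j * rng.choice([-1, 1], 1000)) / np.sqrt(2)
noisy = qpsk + (rng.normal(0, 0.05, 1000) + 1j * rng.normal(0, 0.05, 1000))
print(f"EVM ≈ {evm_percent(qpsk, noisy):.1f} %")
```

In practice every impairment the abstract lists (broadband noise, phase noise, compression, frequency-response ripple) contributes its own term to this error vector, which is why wideband mmWave test equipment is so demanding.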