Offers in cloud computing, networking, machine learning, and artificial intelligence

The Synchromedia Lab (http://www.synchromedia.ca) at the École de Technologie Supérieure (Université du Québec) is seeking talented MSc and PhD candidates specializing in various fields: cloud computing, data analysis, machine learning, big data, artificial intelligence (AI), Internet of Things (IoT), telecommunications, software-defined networks (SDN), network function virtualization (NFV), environment, etc.

In addition to the projects listed below, we are particularly interested in candidates with strong backgrounds in networking, cloud computing, and data analysis in general. Please send us an email with your profile and a statement of research interests.

Applications should include a CV, transcripts, and a cover letter. Selected applicants will be contacted for a screening interview.

Contact: kim-khoa.nguyen@etsmtl.ca

AI-based engine for real-time data analytics in smart buildings


Today, smart technologies are increasingly used in smart buildings to reduce energy consumption and to improve comfort and user satisfaction. This is a challenging goal due to the heterogeneity of buildings and their components, as well as the changing conditions of the external environment. Real-time and continuous optimization techniques are therefore required, and they rely heavily on predictive parameters of a building's elements. Forecasting building behaviors, such as temperature, humidity, noise, light intensity, and energy consumption, requires a complex mathematical model governed by complex physical and behavioural phenomena. It is affected by a multitude of parameters, which can be classified into three groups: outdoor conditions, building characteristics, and occupants' behavior. Various sensing techniques using different communication and calibration methods add new challenges to data processing and decision-making, as well as increased management costs. In our prior work, we have developed an optimization framework that minimizes the total energy consumption of geographically distributed buildings. Integrating forecast data into such an optimization framework is challenging, because the data has to be modelled accurately. Using state-of-the-art machine learning algorithms to model smart building behaviors has been proposed recently. However, these prior works have focused mainly on simulation environments or small-scale buildings. Little attention has been paid to industrial buildings, which generate an enormous quantity of data. Once collected, this data is so vast that it is extremely hard to analyze with traditional data analytics tools.

Facing this situation, and to generate value from this large data set, our goal is to build machine learning "spiders" that look for situations with known patterns that can be expressed as variables in complex AI algorithms. These spiders will collect the necessary data, grabbing all the points involved at the level of an HVAC sub-system, a zone, multiple zones, the whole building, or the whole campus. These algorithms will eventually help generate optimized control policies in real time and could replace building engineers in day-to-day tasks.
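
As a concrete illustration of the forecasting step described above, the following is a minimal sketch of a data-driven model that predicts zone temperature from lagged sensor readings. The column names ('outdoor_temp', 'occupancy', 'hvac_power', 'zone_temp') and the choice of gradient boosting are illustrative assumptions, not the lab's actual pipeline.

```python
# Minimal forecasting sketch: predict zone temperature from recent sensor history.
# Assumes an hourly pandas DataFrame with hypothetical columns 'outdoor_temp',
# 'occupancy', 'hvac_power', and the target 'zone_temp'.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

def make_features(df, lags=(1, 2, 3, 24)):
    """Add lagged copies of the target so the model can learn daily patterns."""
    out = df.copy()
    for lag in lags:
        out[f"zone_temp_lag{lag}"] = out["zone_temp"].shift(lag)
    return out.dropna()

def train_forecaster(df):
    data = make_features(df)
    X = data.drop(columns=["zone_temp"])
    y = data["zone_temp"]
    # Keep the split chronological: train on the past, test on the most recent 20%.
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, shuffle=False)
    model = GradientBoostingRegressor(n_estimators=200, max_depth=3)
    model.fit(X_train, y_train)
    print("MAE on held-out hours:", mean_absolute_error(y_test, model.predict(X_test)))
    return model
```

In a real deployment the same pattern would be repeated per zone or per HVAC sub-system, which is where the "spiders" above would feed their collected points.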


Software-defined networking for IoT services in smart community


The introduction of massive numbers of new Internet of Things (IoT) applications, in particular those with low-latency requirements, has resulted in demand for computing close to where data is produced. Edge computing is an emerging approach to dealing with this increasing demand. It enables computing tasks to be performed at the last mile, close to end users, instead of in a remote cloud. To fully achieve edge computing, the traditional Central Office (CO) may be transformed into a small-scale data center that is responsible for both network provisioning and computing tasks. Such a computing paradigm offers many features necessary to enable M2M services in the future embedded Internet. Home networks are also rapidly developing to include a large diversity of devices, including mobile phones, personal computers, laptops, TVs, speakers, lights, and electronic appliances.

This project will investigate software-defined solutions for converging optical and wireless access networks in a smart community network, supporting the development of next-generation IoT applications. The candidate will evaluate existing architectures and then design a scalable, robust, and reliable software-defined home and community network. He/she will propose approaches and develop algorithms to virtualize the network and provide QoS for IoT services based on bandwidth sharing and optimization, as well as to define a fault-tolerant architecture for virtual networks and virtual CPE. The project also involves research on resource discovery, network QoS, resource allocation, and virtual resource grouping and sharing.
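
To make the bandwidth sharing and optimization task above concrete, here is a minimal sketch, in plain Python, of a weighted, demand-capped sharing policy for IoT service classes on a shared access link. The class names, weights, and demands are illustrative assumptions only.

```python
# Weighted bandwidth sharing sketch: allocate link capacity by class weight,
# cap each class at its demand, and redistribute leftover capacity.
def weighted_share(capacity_mbps, demands, weights):
    alloc = {c: 0.0 for c in demands}
    active = set(demands)
    remaining = capacity_mbps
    while active and remaining > 1e-9:
        total_w = sum(weights[c] for c in active)
        fully_served = set()
        for c in list(active):
            share = remaining * weights[c] / total_w
            give = min(share, demands[c] - alloc[c])
            alloc[c] += give
            if alloc[c] >= demands[c] - 1e-9:
                fully_served.add(c)
        remaining = capacity_mbps - sum(alloc.values())
        if not fully_served:      # every remaining class is weight-limited
            break
        active -= fully_served
    return alloc

# Example: a 100 Mbps access link shared by three hypothetical classes.
print(weighted_share(100, {"video": 80, "sensors": 5, "best_effort": 60},
                     {"video": 3, "sensors": 2, "best_effort": 1}))
```

A QoS-aware controller could apply such an allocation per virtual network slice and push the resulting rate limits to virtual CPEs.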


Data analytics platforms for smart city applications


Cities worldwide are in rapid transition towards a low-carbon environment, high quality of living, and a resource-efficient economy. Data from various sources (e.g., Internet-of-Things devices, smart systems, people) and contexts (e.g., water, energy, traffic, and buildings) are considered the most valuable asset of a smart city. However, the heterogeneity of this data makes it difficult to interpret, combine, analyze, and consume. Artificial intelligence and big data analytics are valuable techniques to explore for addressing this challenge. This project aims at investigating data analytics and machine learning methods to strengthen urban planning and knowledge sharing between cities and stakeholders. It includes the design and development of an interconnected knowledge platform to guide and improve robust decision making on future urban development. Since a fixed data-driven model cannot handle the ever-changing behavior of smart city data, the project will also develop new online machine learning algorithms that are able to distinguish whether the current scenario has already been incorporated during past learning or whether additional information is embedded in new data.

In order to develop such a big data analysis system for smart cities, many research issues must be addressed in the project, for example:

-        How to efficiently combine multiple real-time data sources in the city for further processing?

-        How to extract useful features from high-dimensional datasets?

-        How to actively learn data patterns from (mostly) unlabeled and continuously changing data? (A sketch of this question follows the list.)

-        How to design distributed machine learning algorithms that can run on different processing elements of a smart city infrastructure?
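
As a sketch of the third question, the following toy model learns recurring patterns from unlabeled streaming data and flags batches that do not match what was learned before. The clustering model, thresholds, and synthetic data are illustrative assumptions.

```python
# Streaming pattern model: incrementally cluster unlabeled observations and
# flag a batch as "novel" when it sits unusually far from the learned centroids.
import numpy as np
from sklearn.cluster import MiniBatchKMeans

class StreamingPatternModel:
    def __init__(self, n_patterns=8, novelty_factor=2.0):
        self.kmeans = MiniBatchKMeans(n_clusters=n_patterns, random_state=0)
        self.novelty_factor = novelty_factor
        self.baseline_dist = None

    def update(self, batch):
        """batch: (n_samples, n_features) array of unlabeled observations."""
        if self.baseline_dist is None:          # first batch: just initialize
            self.kmeans.partial_fit(batch)
            self.baseline_dist = self.kmeans.transform(batch).min(axis=1).mean()
            return False
        dist = self.kmeans.transform(batch).min(axis=1).mean()
        is_novel = dist > self.novelty_factor * self.baseline_dist
        self.kmeans.partial_fit(batch)          # keep learning either way
        self.baseline_dist = 0.9 * self.baseline_dist + 0.1 * dist
        return is_novel

# Synthetic usage: after twenty familiar batches, a shifted batch is flagged.
model = StreamingPatternModel()
rng = np.random.default_rng(0)
for _ in range(20):
    model.update(rng.normal(0, 1, size=(256, 4)))
print("novel scenario detected:", model.update(rng.normal(6, 1, size=(256, 4))))
```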

 


Life-cycle analysis (LCA) and environmental optimization in virtual environment


Internet services have undergone multiple revolutions over the past 50 years, from snail mail to household ultra-broadband utility. Today, the tremendous demands of the growing human society are pushing the evolution of Internet-based automation technologies, particularly for knowledge dissemination and intensive information processing. Data centres, regarded as the "brain" of the Internet, are required to host increasingly critical applications, thereby consuming a huge amount of energy. This raises new questions in terms of scalability, resiliency, and operational cost-effectiveness, and environmental concerns in particular.

The Canada Research Chair Tier 1 project at the Synchromedia Lab is addressing issues related to a sustainable smart cloud platform, which is a virtual and analytical system capable of deep, complex computation and intelligent behaviours performed in an energy-efficient and eco-friendly way. By means of smart meters, data collectors, and analytical gears that acquire knowledge about the ecosystem and all the actors involved, including end-users, eco-cloud services are able to react immediately and “just in time”, to establish automated control processes.

This project is aimed at developing a framework to analyze and optimize the environmental emissions of the cloud ecosystem. It includes models and techniques to calculate and/or estimate the environmental impacts of virtual resources, components, services, and systems, in real time and over long-term utilization, at low granularity levels. The candidate is required to apply modeling techniques to represent the virtual environment of the cloud in a complete, accurate, and understandable fashion. Mathematical and optimization models will then be used to optimize the cloud operation process so as to minimize the overall environmental impacts while meeting quality-of-service requirements.
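
As a minimal sketch of the optimization side, the linear program below splits a compute workload across cloud sites to minimize total carbon emissions subject to site capacities; a QoS or latency requirement would simply add further constraint rows. The site data and units are illustrative assumptions.

```python
# Carbon-aware workload placement as a small linear program (scipy).
from scipy.optimize import linprog

carbon_g_per_unit = [30.0, 480.0, 120.0]    # gCO2e per unit of work at each site
capacity_units = [400.0, 1000.0, 600.0]     # per-site capacity
total_demand = 900.0                        # total work that must be placed

res = linprog(
    c=carbon_g_per_unit,                              # minimize total emissions
    A_eq=[[1.0, 1.0, 1.0]], b_eq=[total_demand],      # all demand is served
    bounds=[(0.0, cap) for cap in capacity_units],    # respect capacities
    method="highs",
)
print("placement per site:", res.x, "total gCO2e:", res.fun)
```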


Statistical models for traffic in virtual WAN


Today, network operators increasingly manage their pools of network elements and long-haul connections with the help of WAN controllers such as OSCARS, OpenDRAC, Argia, and v-WAN. With increasing traffic demand and the heterogeneity of network architectures, these WAN controllers face challenges of interoperation and analytics. Therefore, analytical models added to such inter-data-center tools are required to help carriers plan ahead how to deploy their on-demand circuits without excessive over-provisioning.

This project will address traffic models for various types of applications in inter-data-center network environments, especially with respect to Telco and multimedia applications. These models will be used to develop algorithms and methodologies to optimize the resource consumption and environmental impacts of the core network, which is a key element of an inter-data-center orchestrator.
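
A minimal sketch of the statistical side: fit a heavy-tailed distribution to observed inter-data-center demand samples and size an on-demand circuit at a target percentile rather than at the observed peak. The synthetic measurements, the log-normal choice, and the 95th-percentile target are illustrative assumptions.

```python
# Fit a log-normal demand model and derive a percentile-based circuit size.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
demand_gbps = rng.lognormal(mean=2.0, sigma=0.6, size=5000)  # stand-in for measurements

shape, loc, scale = stats.lognorm.fit(demand_gbps, floc=0)   # fit the model
p95 = stats.lognorm.ppf(0.95, shape, loc=loc, scale=scale)   # sizing target

print(f"observed peak demand: {demand_gbps.max():.1f} Gb/s")
print(f"95th-percentile sizing: {p95:.1f} Gb/s")             # well below the peak
```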


Interoperable interfaces for inter-cloud environment


Today's data centres, regarded as the "brain" of the Internet, are required to host increasingly critical applications, thereby consuming a huge amount of energy. This raises new questions in terms of scalability, resiliency, and operational cost-effectiveness, and environmental concerns in particular. The Canada Research Chair Tier 1 project at the Synchromedia Lab is addressing issues related to a sustainable smart cloud platform, which is a virtual and analytical system capable of deep, complex computation and intelligent behaviours performed in an energy-efficient and eco-friendly way.

A key issue when building a cloud-based ecosystem spanning multiple infrastructures is interoperability. Existing cloud computing solutions have not been built with interoperability in mind. They usually lock customers into a single cloud infrastructure, platform or service, preventing the portability of data or software created by them.

This project is therefore defined to build interfaces to import/export operational data between different cloud infrastructures, such as different OpenStack domains, or between OpenStack and Amazon AWS domains. The interfaces will also provide additional control functions to manage virtual resources across different cloud infrastructures.
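
A minimal sketch of what such an interface could look like: a cloud-neutral record format and a common adapter contract that OpenStack- and AWS-specific implementations would fill in. All class and field names here are hypothetical; real adapters would call the respective SDKs.

```python
# Cloud-neutral import/export interface sketch (names are illustrative).
from abc import ABC, abstractmethod
from dataclasses import dataclass

@dataclass
class InstanceRecord:
    """Cloud-neutral view of a virtual machine."""
    name: str
    vcpus: int
    memory_mb: int
    status: str

class CloudAdapter(ABC):
    @abstractmethod
    def export_instances(self) -> list[InstanceRecord]:
        """Read the instance inventory of this cloud in the neutral format."""

    @abstractmethod
    def import_instance(self, record: InstanceRecord) -> None:
        """Create or register an instance in this cloud from a neutral record."""

class OpenStackAdapter(CloudAdapter):
    def export_instances(self):
        raise NotImplementedError  # would map openstacksdk server objects here

    def import_instance(self, record):
        raise NotImplementedError

class AwsAdapter(CloudAdapter):
    def export_instances(self):
        raise NotImplementedError  # would map boto3 EC2 responses here

    def import_instance(self, record):
        raise NotImplementedError

def migrate(source: CloudAdapter, target: CloudAdapter) -> None:
    """Copy inventory records from one cloud domain to another."""
    for record in source.export_instances():
        target.import_instance(record)
```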


Privacy-preserving communications for IoT services


Recently, centralized smart home control has resulted in a new class of applications that require individuals to contribute their private data in order to amass, store, manipulate, and analyze information for decision-making purposes. This is enabled by networked smart objects that monitor and report various types of data, such as energy consumption, temperature, air quality, etc. On the one hand, this fine-grained information enables trending, forecasting, and fault detection analysis, which leads to a more efficient and robust control and management system; on the other hand, it reveals sensitive private information about human actors.

Recently, a strong notion of privacy, namely differential privacy, has emerged; it provides privacy guarantees against adversaries with arbitrary side information. Differential privacy aims at limiting the increase in risk to an individual's privacy when they contribute their data to a statistical database, by guaranteeing that the released results would not change significantly even if that individual's data were removed from the data set. However, strong privacy guarantees may have negative impacts on application performance.
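
The standard building block behind this guarantee is the Laplace mechanism: a query result is perturbed with noise calibrated to the query's sensitivity and the privacy budget epsilon. The sketch below is generic; the energy-reading example and its sensitivity value are illustrative assumptions.

```python
# Laplace mechanism sketch for an epsilon-differentially-private reading.
import numpy as np

def laplace_mechanism(true_value, sensitivity, epsilon, rng=None):
    """Return a noisy estimate of `true_value`; smaller epsilon means more noise."""
    rng = rng or np.random.default_rng()
    scale = sensitivity / epsilon
    return true_value + rng.laplace(loc=0.0, scale=scale)

# Example: report a household's hourly energy reading (kWh), assuming one
# reading can change an aggregate by at most 5 kWh (hypothetical sensitivity).
print(laplace_mechanism(true_value=3.2, sensitivity=5.0, epsilon=0.5))
```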

At the same time, the software-defined networking (SDN) paradigm, which separates the control plane from the data plane, has emerged as a potential solution for dealing with the complexity of IoT networks. In particular, the ability to implement new algorithms that dynamically handle packets in OpenFlow-enabled network elements to achieve specific administrative goals makes it possible to develop new real-time security and privacy mechanisms from an in-network perspective.

This research is aimed at building a new framework for real-time differential privacy from a signal processing perspective, and at implementing a novel method for real-time privacy preservation using SDN technology that flexibly and optimally adds noise to network flows. These results are essential for the development of carrier-grade smart home, Machine-to-Machine (M2M), IoT, and Telco Cloud technologies.


Software-Defined Data Center Fabrics


This project will build on Software Defined Networking (SDN) technology to allow data center network fabrics to be specified using target independent domain specific programming languages. The project will investigate innovative approaches that enable advances in data-plane programming and network-wide policy specification in next-generation data centers.

Recently, the concept of Programming Protocol-Independent Packet Processors has emerged, with a wide range of applications to a variety of networking devices. In particular, domain-specific languages such as P4 have the ability to transform the networking industry by allowing network operators to deploy new network features across a number of network devices in a multi-vendor environment. There is still a wide gap between the concept of a generalized language for packet processing and the ability to deploy arbitrary programs across programmable packet-processing targets such as general-purpose CPUs, Network Processors (NPUs), FPGAs, and programmable ASICs. This project will develop new models and algorithms to address a next-generation programmable forwarding plane of network elements, based on the P4 programming language, with respect to new virtualized and physical network functions.
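
P4 itself is out of scope of this page, but the following minimal Python model illustrates the match-action abstraction that P4 programs and their targets revolve around: a table maps header fields to actions, and a controller populates the table at run time. Field names, actions, and the single-table "pipeline" are illustrative only.

```python
# Toy match-action table in the spirit of protocol-independent packet processing.
class Table:
    def __init__(self, match_field, default_action):
        self.match_field = match_field
        self.default_action = default_action
        self.entries = {}                      # match value -> (action, params)

    def add_entry(self, value, action, **params):
        self.entries[value] = (action, params)

    def apply(self, packet):
        action, params = self.entries.get(packet.get(self.match_field),
                                          (self.default_action, {}))
        return action(packet, **params)

def set_egress(packet, port):
    packet["egress_port"] = port
    return packet

def drop(packet):
    packet["egress_port"] = None
    return packet

# Exact-match forwarding table on the destination IP, populated by a controller.
ipv4_exact = Table(match_field="dst_ip", default_action=drop)
ipv4_exact.add_entry("10.0.0.2", set_egress, port=3)

print(ipv4_exact.apply({"dst_ip": "10.0.0.2"}))   # forwarded to port 3
print(ipv4_exact.apply({"dst_ip": "192.0.2.9"}))  # hits the default action: drop
```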

In addition, while the ability to deliver SDN is rooted in the ability to program individual devices, network configuration remains a complex process. This project will therefore develop new techniques that optimize individual device functions based on the overall network configuration. In practice, Network Function Virtualization (NFV) enables feature deployment agility by moving the data plane from hardware appliances to virtual machines. Virtualization techniques enable server consolidation and the ability to elastically scale capacity through the use of commercial off-the-shelf (COTS) hardware. The outcome of this project will help expand the capability of carrier-grade servers to respond to the ever-increasing volume of network traffic.


Fog-and-cloud service model for IoT


Today, IoT is becoming a global system of connected sensors, actuators, networks, machines, and devices. IoT and cloud integration will enable the development of large-scale IoT applications in domains such as smart cities, energy, health, etc. Moreover, due to requirements such as mobility support, location awareness, and low latency, the cloud has recently been extended to the edge of the network as Fog Computing. Developing large-scale IoT applications using cloud and fog computing resources is challenging because it requires a service abstraction model that matches highly dynamic and heterogeneous resources at different levels of the network hierarchy, from IoT devices to fog devices and the cloud. This project is dedicated to developing knowledge, software service design concepts, and mechanisms for the scalable and dynamic integration of IoT devices and their services into fog and cloud platforms. It will devise solutions for:

  • Modeling, developing, and integrating IoT services in dynamic Fog-Cloud computing systems
  • Managing and adapting these services with respect to the dynamicity of IoT devices and the dynamic availability of fog resources
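
A minimal sketch of the placement decision behind the two points above: run a service on a fog node when its latency bound and the node's remaining capacity allow it, and fall back to the cloud otherwise. All service names, latency bounds, and capacities are illustrative assumptions.

```python
# Latency- and capacity-aware fog/cloud placement heuristic (illustrative).
def place_services(services, fog_nodes, cloud_latency_ms=80):
    """services: dicts with 'name', 'latency_ms' (bound), 'cpu' demand.
    fog_nodes: dicts with 'name', 'latency_ms' to users, 'cpu_free'."""
    placement = {}
    for svc in sorted(services, key=lambda s: s["latency_ms"]):   # tightest first
        target = "cloud"
        for node in fog_nodes:
            if node["latency_ms"] <= svc["latency_ms"] and node["cpu_free"] >= svc["cpu"]:
                node["cpu_free"] -= svc["cpu"]
                target = node["name"]
                break
        if target == "cloud" and cloud_latency_ms > svc["latency_ms"]:
            target = "unplaceable"    # no location can satisfy the latency bound
        placement[svc["name"]] = target
    return placement

print(place_services(
    services=[{"name": "alarm", "latency_ms": 10, "cpu": 1},
              {"name": "analytics", "latency_ms": 500, "cpu": 4}],
    fog_nodes=[{"name": "fog-1", "latency_ms": 5, "cpu_free": 2}],
))
```

In the project, the same decision would have to be revisited continuously as devices move and fog capacity fluctuates.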

 


Optimized architecture for IoT applications


Applications for the Internet of Things (IoT) are interesting, and their design is challenging since they have to fulfil several requirements that are tightly connected with each other. Of course, applications need to be functionally correct. In addition, they need to use computing resources efficiently, since parts of the application often run on battery-driven sensor nodes. At the same time, applications need to be secure and, for instance, protect the privacy of user data. The communication in the network needs to be scalable, and its architecture should be robust. There may also be challenges regarding user-friendliness, because of the constrained user interfaces that these systems often have. Altogether, this calls for a robust, modular, optimized, and highly available architecture, and these combined requirements often hinder the creation of such systems in a cost-effective way. Approaches to reduce complexity, such as the provision of frameworks or middleware, as applied in other areas, may not be possible due to resource constraints or to the tight dependencies between technologies that also evolve quickly.

The research thus focuses on the overall challenges of developing IoT applications that satisfy several requirements. The work also covers architectures for IoT systems, identifying where specific functionality should be executed and how different architectures influence the security and robustness of the system. After a study of the problem domain and ongoing research, the work should identify and advance viable techniques, strategies, and architectures that ease application development for IoT systems. This can include formal techniques, model-driven approaches like code generation, and other analytical methods that can assist IoT application developers.


AI-based controller for data center fabric


In recent years, there has been an emerging demand for improving the quality of high-bandwidth web services delivered to users, driven by the arrival of the Internet of Things (IoT) and cloud-based applications. To meet this new demand, Internet Service Providers (ISPs) have to deploy their services closer to end users, creating a new category of data center service providers whose facilities extend well beyond the traditional Internet hubs.

This project will address emerging challenges faced by next-generation cloud fabric controllers. It will result in a framework that analyzes workload and traffic in real time, improves virtualization performance in data centers, and optimally orchestrates data center facilities, with a view to achieving efficient infrastructure provisioning, minimizing resource consumption, and guaranteeing QoE/QoS. Such a framework will include novel methodologies and algorithms that optimize the performance of the data center controller and support new types of data center enabling technologies, such as service function chaining, low latency, and zero loss rate. This project will result in a new model providing services for a class of cloud-based applications, such as e-health, smart utility, smart transport, smart home, smart city, etc. The main objectives of the project are:

-        Proposing new models to characterize and predict traffic and workload in data centers based on advanced statistical and profiling mechanisms.

-        Developing awareness methodologies and artificial intelligence (AI) algorithms to optimally consolidate, share, and allocate resources in data centers, in terms of physical compute servers and network bandwidth, to meet the QoE/QoS requirements of a class of cloud-based applications (a simple consolidation sketch follows this list).

-        Leveraging software-defined networking (SDN) and network function virtualization (NFV) technologies to support multipath routing, multicasting, and dynamic scheduling and planning of workload and traffic in real time on virtual resources in the data center.
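
As a sketch of the consolidation piece of the second objective, the heuristic below packs predicted VM loads onto as few servers as possible (first-fit decreasing) so that idle servers can be released or powered down. VM names, loads, and the unit server capacity are illustrative assumptions.

```python
# First-fit-decreasing consolidation of predicted VM loads onto servers.
def consolidate(vm_loads, server_capacity=1.0):
    """vm_loads: dict of VM name -> predicted CPU demand (fraction of one server)."""
    servers = []                      # each entry: {"free": capacity left, "vms": [...]}
    for vm, load in sorted(vm_loads.items(), key=lambda kv: kv[1], reverse=True):
        for srv in servers:
            if srv["free"] >= load:   # first server with enough headroom
                srv["free"] -= load
                srv["vms"].append(vm)
                break
        else:                         # no existing server fits: open a new one
            servers.append({"free": server_capacity - load, "vms": [vm]})
    return servers

print(consolidate({"web": 0.6, "db": 0.5, "cache": 0.3, "batch": 0.4}))
```

In the project this step would be driven by the traffic and workload predictions from the first objective rather than by static numbers.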

 
