Internet of Things
As we move away from fossil fuels toward renewable energy sources such as solar and wind, inexpensive energy storage technologies are required, because these sources are intermittent. An alternative to batteries, which are quite expensive, is “smart loads”, such as air conditioners equipped with computation and communication capability. With appropriate software, the power consumption of air conditioning (and many other loads) can be varied around a baseline, in a manner analogous to the charging and discharging of a battery. Loads equipped with such intelligence have the potential to provide a vast and inexpensive source of energy storage. Two principal challenges in creating a reliable virtual battery from millions of consumer loads are (1) maintaining consumers’ Quality of Service (QoS) within strict bounds, and (2) coordinating the actions of loads with minimal communication so that the aggregate tracks its reference accurately.
When the loads in question are residential loads that can only be turned on or off, the coordination problem suffers from a combinatorial explosion. This talk describes our work in addressing this challenge using randomized control, in which control actions are decided probabilistically. A key advantage of this approach is that the aggregate behavior of a collection of loads can be approximated by a linear time-invariant (LTI) system. Two classes of on/off loads will be considered: deferrable loads, such as water pumps, and thermostatically controlled loads (TCLs), such as air conditioners. The latter are more challenging because of the additional randomness introduced by weather and consumer behavior: while the former can be modeled by a finite-state-space Markov chain, the latter require an infinite state space.
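To make the randomized-control idea concrete, here is a minimal sketch (not the talk's actual design; the probabilities, load count, and horizon are illustrative): at every step each off load turns on with probability p_on and each on load turns off with probability p_off. In the mean-field limit the on-fraction f obeys the LTI recursion f_{k+1} = (1 - p_off) f_k + p_on (1 - f_k), with equilibrium p_on / (p_on + p_off).

```python
import random

def simulate_aggregate(n_loads=10000, steps=50, p_on=0.3, p_off=0.7, seed=0):
    """Simulate n_loads on/off loads under randomized control.

    At each step an 'off' load turns on with probability p_on and an
    'on' load turns off with probability p_off (illustrative values; a
    real design would choose them to track a reference signal).
    Returns the fraction of loads that are on at each step.
    """
    rng = random.Random(seed)
    state = [False] * n_loads   # all loads start off
    fractions = []
    for _ in range(steps):
        for i in range(n_loads):
            if state[i]:
                if rng.random() < p_off:
                    state[i] = False
            elif rng.random() < p_on:
                state[i] = True
        fractions.append(sum(state) / n_loads)
    return fractions
```

With these numbers the aggregate on-fraction settles near p_on / (p_on + p_off) = 0.3, and its transient matches the LTI recursion above up to fluctuations of order 1/sqrt(n_loads).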
The design of complex engineered systems requires the concerted effort of diverse stakeholders, each responsible for a part or an aspect of the overall problem. Inconsistencies emerge naturally when different parts of a system design are modified concurrently without regard for their dependencies. Such dependencies range from easy to detect and fix to those requiring costly simulations and rework. This presentation introduces modelling-language engineering concepts, in particular linguistic and ontological properties, as a means to reason about the relationships between different views and engineering disciplines. Subsequently, we explore the link between consistency management and design processes.
Vector Space Models represent words in a high-dimensional space where “semantically” similar words are mapped to neighboring points. Such techniques have a long, rich history in the field of Natural Language Processing (NLP), but all methods depend in one way or another on the Distributional Hypothesis, which states that words that appear in the same contexts share semantic meaning. Here we share our experience with one such predictive model, Word2vec Skip-Gram, a particularly computationally efficient scheme for learning word embeddings from raw text.
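As a rough illustration of the Skip-Gram scheme, the sketch below extracts the (target, context) training pairs that Word2vec's shallow network is trained on; the tokenisation and window size here are simplifications, not the pipeline used in the talk.

```python
def skipgram_pairs(tokens, window=2):
    """Generate the (target, context) pairs used by Skip-Gram training:
    each word is asked to predict the words within `window` positions
    of it on either side."""
    pairs = []
    for i, target in enumerate(tokens):
        lo = max(0, i - window)
        hi = min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                pairs.append((target, tokens[j]))
    return pairs
```

For example, `skipgram_pairs(["the", "gene", "regulates", "expression"], window=1)` pairs each word with its immediate neighbours; the network then learns embeddings that make co-occurring words predictable from one another.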
The shallow neural network implementation of the model applied to a large Life Science corpus results in a Semantic Bio-Knowledge Graph of nodes (corresponding to words/phrases) and edge weights determined by metrics such as the cosine distance between vector pairs. We explore various properties exhibited by the Semantic Bio-Knowledge Graph and highlight early results that suggest novel associations emerging from the data (such as between genes and diseases, drugs and genes, genes and genes, etc.). In particular, temporal analysis of the graph yields robust predictions of certain associations well ahead of their actual occurrence in the primary literature. We believe that a larger distributed software system, nferX, built on this model can augment knowledge synthesis and hypothesis generation in an era of exponentially growing literature.
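Edge weights of such a graph are derived from the learned vectors; a minimal sketch of the cosine metric follows (the graph-construction details are assumptions for illustration, not the nferX pipeline).

```python
import math

def cosine_similarity(u, v):
    """Cosine similarity between two embedding vectors.  An edge of the
    knowledge graph can be weighted by this value, or by
    1 - similarity when a distance is wanted instead."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)
```

Parallel vectors score 1.0 (strong association), orthogonal vectors score 0.0, so thresholding this value gives one simple way to decide which node pairs receive an edge.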
Formal modelling is hard and is often difficult to scale to large and complex systems. As part of the AMADEOS project, a tool was designed to facilitate rapid modelling and simulation of systems-of-systems (SoS) using a customization of Google's Blockly tool. Blockly was adopted to ease the design of SoS by means of a simpler, more intuitive user interface, requiring minimal technology expertise and support on the part of the SoS designer. This talk will showcase some of the ideas and results of the tool developed for the AMADEOS project.
The rise of the Internet of Things and cyber-physical systems creates new challenges in interconnecting a variety of devices and people so that these systems function in a holistic manner, addressing issues of efficiency, robustness, and resilience. As these systems become increasingly complex, there is a need to revisit systems engineering approaches, standards, and semantic interoperability.
This talk will also provide an example of a Smart City project in Downtown Washington, D.C., and the approach taken from both a city-management and a technology perspective. Further, the talk will address challenges of interoperability by making the case that category theory provides a possible semantic foundation for the engineering of such complex interlinked systems.
Modern control engineering applications, such as cooperative adaptive cruise control and the regulation of large-scale systems (e.g., city-wide water or power distribution systems), require synergistic interaction among sensors, communication channels, controllers, and actuators. The use of a communication network naturally gives rise to several sources of uncertainty: packet dropouts due to unreliable transmissions, variable transmission delays, etc. These degrade system performance relative to the ideal model and may even compromise stability, stabilizability, and controllability. Networked control systems (NCS) under uncertainties are conveniently abstracted by mathematical models involving switching, for the study of both qualitative and quantitative behaviours. In this talk we discuss several algorithms for stabilizing switched systems. We employ graph-theoretic tools, and our algorithms have both deterministic and probabilistic flavours.
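A toy illustration of the switched-system abstraction (the matrices and switching signal below are invented for the example, not taken from the talk): the closed-loop dynamics become x_{k+1} = A_{σ(k)} x_k, where σ(k) selects the mode active at step k, and one asks whether the state converges under a given or designed switching signal.

```python
def simulate_switched(modes, sigma, x0, steps):
    """Simulate x_{k+1} = A_{sigma(k)} x_k for a 2-D switched linear
    system.  `modes` is a list of 2x2 matrices (lists of lists) and
    `sigma` maps the step index k to a mode index."""
    x = list(x0)
    for k in range(steps):
        A = modes[sigma(k)]
        x = [A[0][0] * x[0] + A[0][1] * x[1],
             A[1][0] * x[0] + A[1][1] * x[1]]
    return x

# Two contractive modes, alternating every step (illustrative values).
A1 = [[0.5, 0.1], [0.0, 0.5]]
A2 = [[0.5, 0.0], [0.1, 0.5]]
x_final = simulate_switched([A1, A2], lambda k: k % 2, [1.0, 1.0], 30)
```

Here both modes are contractive in the infinity norm, so the state decays regardless of the switching order; the interesting (and hard) cases the talk addresses are those where individual modes or the switching signal do not make stability this obvious.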
Large-scale distributed systems implementing services such as Google’s search or Amazon’s payment services are sophisticated, complex systems with hundreds of thousands of servers, hundreds to thousands of component services, and redundancy across multiple locations around the globe. One of the most critical problems in operating such systems is the need to find a small set of parameters which, if monitored correctly, will provide reliable, actionable information about the health of the system or services as a whole. This talk will focus on using response time distributions as an indicator of overall system health.
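As a hypothetical illustration of the approach (the SLO value and the nearest-rank percentile choice are assumptions, not the talk's method), one might reduce a service's response-time distribution to a tail percentile and compare it against a target:

```python
import math

def percentile(samples, q):
    """Empirical q-th percentile (0 < q <= 100) of a list of samples,
    using the nearest-rank method."""
    s = sorted(samples)
    k = math.ceil(q / 100.0 * len(s)) - 1
    return s[max(0, k)]

def healthy(response_times_ms, slo_p99_ms=250.0):
    """Crude health check: the service is 'healthy' if its observed
    99th-percentile response time is within the (illustrative) SLO."""
    return percentile(response_times_ms, 99) <= slo_p99_ms
```

Watching the tail rather than the mean is the usual motivation: a handful of very slow responses barely moves the average but shows up immediately in the 99th percentile.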
This talk will bring forth the burden of hidden hunger both globally and within the country, evaluate the causality with regard to iron deficiency, and provide scientific methodologies to assess and develop feasible approaches to impact hidden hunger at the policy and population levels.
There are several far reaching changes underway in power systems. Renewable energy sources such as solar and wind are time-varying. To enhance their usage, demand will need to be adjusted to meet supply, rather than the other way around as is traditional. This raises several issues lying at the confluence of economic behavior and elasticity, demand pooling, implicit or explicit storage, information availability, privacy, adaptation and control. At the same time, the increasing potential deployment of PMUs raises issues concerning how to make the huge amount of data intelligible and visualizable to operators, and to make inferences from the data. On another front, the increasing potential usage of electric vehicles raises issues surrounding how to charge them. I will describe some efforts at understanding these problems.
The Internet of Things (IoT) is a vision for a ubiquitous society wherein people and “Things” are connected in an immersively networked computing environment. The past couple of years have seen heightened interest in the IoT space, transcending industry, academia, and government. Estimates of the spread and economic impact of IoT over the next few years run to 50 billion or more connected “Things”, with a market exceeding $350 billion. The enterprise, societal, and individual benefits touted are significant, with smarter cities and infrastructure, intelligent appliances, and healthier lifestyles in the offing.
This talk will discuss assumptions and challenges on the path of integrating physical “things” into an interconnected system.
Sensor networks have evolved through many distinct phases over the last two decades. They have achieved a set of milestones in the process; however, some of the initially hyped topics strayed and never delivered what was envisioned. This talk will summarise the outcomes of the key focus areas of sensor networks over the last decade. Subsequently, it will throw some light on “Sensor Network in Smart Grid and Energy Management”. Energy consumption can be reduced by using low-consumption devices and by identifying and eliminating energy wastage. Drawing on past deployments, the speaker will explain how wastage of electricity has been identified and eliminated by sensor network technologies, achieving results that were not possible otherwise.
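As a simplified, hypothetical illustration of wastage detection (the sensor names and the threshold are invented for the example), one might cross-reference power measurements with occupancy readings to flag loads left running needlessly:

```python
def wastage_intervals(power_w, occupied, threshold_w=100.0):
    """Flag time slots where measured power exceeds `threshold_w`
    watts while the occupancy sensor reports the space as vacant --
    a crude proxy for appliances or lighting left on to no purpose.
    `power_w` and `occupied` are equal-length, time-aligned lists."""
    return [t for t, (p, occ) in enumerate(zip(power_w, occupied))
            if p > threshold_w and not occ]
```

A deployment would then aggregate such flagged slots per circuit or per room to quantify the recoverable energy and to decide where automatic switching is worthwhile.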
Use of AHPCO (Advanced Hydrated Photo Catalytic Oxidation) and plasma nanotechnology to develop devices for reducing aeroallergen and microbial contamination during food processing, by Nabarun Ghosh, West Texas A&M University, 19 June 2015
The present-day business world demands newer technologies to build cheap and efficient marketable products. A decade of research in aerobiology and biotechnology has produced an air purification system that uses Advanced Hydrated Photo Catalytic Oxidation (AHPCO) and plasma nanotechnology to reduce indoor aeroallergens, improving air quality and food preservation. Air Oasis air purifiers utilize a new-generation AHPCO technology that does not rely on filters or on air passing through the purifier. Innovations in technology continue to have a massive impact on business and society. Technology is a process and a body of knowledge as much as a collection of artifacts. Biology is no different, and we are just beginning to comprehend the challenges inherent in the next stage of biology as a human technology.
During the past nine years we have been assessing AHPCO and plasma nanotechnology for the net reduction of bacteria, fungi, and VOCs, with a specific focus on methicillin-resistant Staphylococcus aureus (MRSA). A five-year research project carried out at the BSA Hospital and Coulter Animal Hospital in Amarillo, Texas showed a gradual reduction of airborne bacteria, aeroallergens, and VOCs in the indoor air, significantly improving air quality when specialized air purification units were used in the Microbiology and Mycology Laboratories of the BSA Hospital.
Safety evaluations showed no side effects on human cell cultures. Indoor aeroallergens such as airborne fungal spores, airborne bacteria, and animal dander were significantly reduced when the air purification units were used, improving indoor air quality and alleviating breathing ailments.
The omnipresence of indoor lighting makes it an ideal vehicle for pervasive communication with mobile devices. First, I will present a technique that enables interior ambient LED lighting systems to communicate with mobile devices and low-power embedded devices in a manner that is imperceptible to people. This system enables the lights to act as landmarks for these devices to identify their location in indoor environments.
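One common way to signal through lighting without perceptible flicker is high-rate on-off keying with a DC-balanced line code; the talk does not state that this particular system works that way, so the following Manchester-coding sketch is purely illustrative of the general principle.

```python
def manchester_encode(bits):
    """Manchester-encode a bit sequence for an LED: 1 -> (on, off),
    0 -> (off, on).  Every bit spends exactly half its period on, so
    the average brightness is constant and, at a high enough chip
    rate, the modulation is invisible to the human eye."""
    chips = []
    for b in bits:
        chips += [1, 0] if b else [0, 1]
    return chips

def manchester_decode(chips):
    """Recover the bits: the first chip of each pair carries the bit."""
    return [1 if chips[i] == 1 else 0 for i in range(0, len(chips), 2)]
```

The DC balance is what makes the scheme imperceptible: however the data varies, exactly half the chips are on, so a light identifier can be broadcast continuously while the room's illumination level never changes.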
The second solution I will present is an ultrasound ranging-based system. The system consists of time-synchronized ultrasonic transmitters that utilize the audio bandwidth just above the human hearing range. An unmodified mobile phone receives these signals to determine its range from the transmitters and estimates its location. This system secured first place in the infrastructure-based category of the Microsoft Indoor Localization Competition held at IPSN 2015.
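Given ranges to three time-synchronized beacons at known positions, a position estimate can be obtained by linearising the range equations; here is a minimal 2-D sketch (the actual system's estimator may well differ, and real deployments use more beacons with least squares):

```python
def trilaterate(beacons, ranges):
    """Estimate a 2-D position from three beacon positions and the
    measured ranges to them.  Subtracting the first range equation
    from the other two cancels the quadratic terms, leaving a 2x2
    linear system solved here by Cramer's rule."""
    (x1, y1), (x2, y2), (x3, y3) = beacons
    r1, r2, r3 = ranges
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)
```

With noisy ranges the same linearisation extends naturally to more than three beacons via least squares, which is also why mapping the beacon positions accurately (the subject of the next paragraph's work) matters so much.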
However, setting up any ranging-based system and mapping the beacons is an arduous and labor-intensive process. I will present some early work in automating the beacon-mapping process, and motivate the need to start building representations, tools, and frameworks that allow the fusion of data from different current and future localization technologies.
The availability and rapid development of portable sensors and ubiquitous computing capabilities are providing new opportunities in healthcare for sustainable, personalized, and readily accessible medical treatment. Parkinson’s disease (PD) is a degenerative neurological condition with severe symptoms mainly affecting mobility. New neuroscience and clinical approaches are showing evidence that it is possible to cope with some of the symptoms by using the brain plasticity still present to enhance motor learning in PD patients. On this basis, in an EU-funded research project (involving eight different research centers), a portable device has been developed that includes MEMS sensors (accelerometers and gyroscopes), real-time data processing, and real-time feedback to the patient. It can administer rehabilitative exercises automatically and reliably, with the aim of increasing patients’ quality of life.
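As a hypothetical example of the kind of real-time processing such a device might perform (the algorithm, threshold, and units are invented for illustration, not taken from the project), here is a crude step counter over the accelerometer magnitude:

```python
def count_steps(accel_mag, threshold=1.2):
    """Count steps as upward crossings of a threshold on the
    acceleration magnitude (in g).  A real device would band-pass
    filter the signal and adapt the threshold per patient; this is
    the bare hysteresis-free skeleton of the idea."""
    steps = 0
    above = False
    for a in accel_mag:
        if not above and a > threshold:
            steps += 1      # rising edge: one step detected
            above = True
        elif above and a < threshold:
            above = False   # wait for the signal to fall back
    return steps
```

Metrics like step rate and regularity computed this way can then drive the real-time feedback loop, cueing the patient when gait deteriorates during an exercise.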
Combining simulations with formal analysis can improve the design, verification, and validation processes for embedded and cyber-physical systems. In this talk, I will present an overview of the algorithms we have developed to derive bounded-time formal guarantees for interesting models of cyber-physical systems. The algorithms use numerical simulations and static analysis of models. They are always sound, and are also complete for robust safety and temporal precedence properties. For large networked models, the annotations can be derived compositionally. I will mention the lessons learned and the outstanding challenges in applying them to verify a parallel landing protocol and a cardiac cell-pacemaker network.
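To convey the flavour of simulation-driven verification (this toy is not the talk's algorithm): for the scalar system dx/dt = -x, the discrepancy between any two trajectories is exactly |x0 - y0| e^{-t}, so a single simulation from the centre of an interval of initial states, bloated by that bound, covers every trajectory from the interval, which yields a sound bounded-time safety check.

```python
import math

def verify_bounded_safety(x_lo, x_hi, unsafe_above, T, dt=0.01):
    """Toy simulation-driven safety check for dx/dt = -x.

    Simulate (Euler) from the centre of [x_lo, x_hi] and bloat the
    trajectory by the discrepancy bound
        |x(t) - y(t)| <= |x0 - y0| * exp(-t)
    (exact for this linear system), so the resulting tube contains
    every trajectory starting in the interval.  Returns True iff the
    entire tube stays below `unsafe_above` up to time T."""
    centre = 0.5 * (x_lo + x_hi)
    radius = 0.5 * (x_hi - x_lo)
    t, x = 0.0, centre
    while t <= T:
        upper = x + radius * math.exp(-t)   # top of the reach tube
        if upper >= unsafe_above:
            return False                    # tube touches the unsafe set
        x += dt * (-x)                      # Euler step of dx/dt = -x
        t += dt
    return True
```

The algorithms in the talk generalise this idea: discrepancy bounds (supplied as model annotations) bloat finitely many simulations into a cover of all behaviours, which is what makes the guarantees sound, and refining the initial-set partition is what makes them complete for robust properties.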