The Computer Science Colloquium series is held each semester and is sponsored by the Computer Science department. Faculty invite speakers from all areas of computer science, and the talks are open to all members of the RPI community.


2018

May
15
2018
Allocating Scarce Societal Resources Based on Predictions of Outcomes

Abstract:

Demand for resources that are collectively controlled or regulated by society, like social services or organs for transplantation, typically far outstrips supply. How should these scarce resources be allocated? Any approach to this question requires insights from computer science, economics, and beyond; we must define objectives, predict outcomes, and optimize allocations, while carefully considering agent preferences and incentives. In this talk, I will discuss our work on weighted matching and assignment in two domains, namely living donor kidney transplantation and provision of services to homeless households. My focus will be on how effective prediction of the outcomes of matches has the potential to dramatically improve social welfare both by allowing for richer mechanisms and by improving allocations. I will also discuss implications for equity and justice.

 

This talk is based on joint work with Zhuoshu Li, Amanda Kube, Sofia Carrillo, Patrick Fowler, and Jason Wellen.
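As a hedged, generic illustration of how predicted outcomes can drive weighted matching and assignment of the kind described above (the data and model here are stand-ins, not the speakers' system), the sketch below feeds an outcome model's scores into a standard assignment solver:

```python
# Hypothetical sketch: an outcome model scores each (agent, resource)
# pair, and a maximum-weight assignment allocates accordingly. The
# scores here are random stand-ins for a trained prediction model.
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(0)
# score[i, j] = predicted outcome (e.g., probability of success) if
# agent i receives resource j.
score = rng.random((5, 5))

# linear_sum_assignment minimizes cost, so negate to maximize outcomes.
agents, resources = linear_sum_assignment(-score)
print("assignment:", list(zip(agents, resources)))
print("total predicted outcome:", score[agents, resources].sum())
```

Better outcome predictions directly change the edge weights, which is one way richer prediction can improve the resulting allocation.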

 

Bio:

Sanmay Das is an associate professor in the Computer Science and Engineering Department at Washington University in St. Louis. Prior to joining Washington University, he was on the faculty at Virginia Tech and at Rensselaer Polytechnic Institute. Das received Ph.D. and S.M. degrees from the Massachusetts Institute of Technology, and an A.B. from Harvard University, all in Computer Science. Das' research is in artificial intelligence and machine learning, especially their intersection with finance, economics, and the social sciences. He has received an NSF CAREER Award and the Department Chair Award for Outstanding Teaching at Washington University. Das has served as program co-chair of AAMAS and AMMA, and regularly serves on the senior program committees of major AI conferences like AAAI and IJCAI.

Professor Sanmay Das, Washington University
Sage Laboratory, Rm 4101 4:00 pm

Apr
17
2018
Enriching Social Network Analysis for Adversarial Settings

Abstract:

Although each individual thinks that the decision to form a link in a social network is an autonomous, local decision, we know that this isn’t so. Larger forces are involved: human mental limitations (Dunbar’s Number), social balance that induces triangle closure and resists certain signed triads (Georg Simmel), and the flow of properties ‘over the horizon’ within networks (Christakis). Social network analysis has three phases: aggregating individual link decisions, understanding the global structure that results, and revisiting each link in the context of this global structure; and both the second and third phases provide insights.

The relationships associated with links are much richer than most social network analysis recognises: relationships can be of qualitatively different types (friend, colleague), directed (so the intensity one way is different from the intensity the other), signed (both positive and negative, friend or enemy), and dynamic (changing with time). All of these extra properties, and their combinations, can be modelled using a single scheme, creating an expanded social network, which can then be analysed in conventional ways (for example, by spectral embedding).
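As a minimal sketch of the "conventional analysis" step, here is a spectral embedding of a symmetric, unsigned adjacency matrix; in the expanded-network scheme described above, the matrix A would be replaced by the larger matrix encoding typed, directed, and signed relationships (that substitution is assumed, not shown):

```python
# A minimal sketch, assuming a symmetric, unsigned adjacency matrix A.
import numpy as np

def spectral_embed(A, k=2):
    """Embed nodes in k dimensions via eigenvectors of the normalized
    Laplacian L = I - D^{-1/2} A D^{-1/2}."""
    deg = A.sum(axis=1)
    d = 1.0 / np.sqrt(np.maximum(deg, 1e-12))   # guard isolated nodes
    L = np.eye(len(A)) - d[:, None] * A * d[None, :]
    vals, vecs = np.linalg.eigh(L)               # eigenvalues ascending
    return vecs[:, 1:k + 1]                      # skip trivial eigenvector
```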

Social network analysis is especially useful in adversarial settings (where the interests of those doing the modelling and [some of] those being modelled are not aligned) because each individual cannot control much of the global structure. I will illustrate how this pays off in law enforcement and counterterrorism settings.

Professor David Skillicorn, Queen's University
Troy Bldg, Room 2018 4:00 pm

Apr
12
2018
Using Theory and Data for Better Decisions

Abstract:

The internet and modern technology enable us to communicate and interact at lightning speed and across vast distances.  The communities and markets created by this technology must make collective decisions, allocate scarce resources, and understand each other quickly, efficiently, and often in the presence of noisy communication channels, ever-changing environments, and/or adversarial data.  Many theoretical results in these areas are grounded on worst-case assumptions about agent behavior or the availability of resources. Transitioning theoretical results into practice requires data-driven analysis and experiment, as well as novel theory with assumptions based on real-world data. I'll discuss recent work on creating novel algorithms, including a novel, strategyproof mechanism for selecting a small subset of winners among a group of peers, and algorithms for resource allocation with applications ranging from reviewer matching to deceased organ allocation. These projects require novel algorithms and leverage data to perform detailed experiments, as well as creating open-source tools.
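For a flavor of the strategyproof peer-selection idea, here is a hedged sketch of a simplified partition-based rule (in the spirit of mechanisms from this line of work, such as Exact Dollar Partition, and not necessarily the one presented in the talk): agents are split into clusters and are ranked only by reviewers outside their own cluster, so no agent's report can influence their own selection.

```python
# Simplified partition-based peer selection; names and parameters are
# illustrative, not the speaker's exact mechanism.
import numpy as np

def peer_select(scores, k_clusters=2, winners_per_cluster=1, seed=0):
    """scores[i, j] = agent i's evaluation of agent j."""
    n = scores.shape[0]
    rng = np.random.default_rng(seed)
    clusters = np.array_split(rng.permutation(n), k_clusters)
    winners = []
    for c in clusters:
        outside = np.setdiff1d(np.arange(n), c)
        # Rank members of this cluster using only outside reviewers.
        totals = scores[np.ix_(outside, c)].sum(axis=0)
        top = np.argsort(-totals)[:winners_per_cluster]
        winners.extend(int(v) for v in c[top])
    return sorted(winners)
```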

Bio:

Nicholas Mattei is a Research Staff Member in the Cognitive Computing Group at the IBM T.J. Watson Research Laboratory.  His research is in artificial intelligence (AI) and its applications, largely motivated by problems that require a blend of techniques to develop systems and algorithms that support decision making for autonomous agents and/or humans. Most of his projects leverage theory, data, and experiment to create novel algorithms, mechanisms, and systems that enable and support individual and group decision making. He is the founder and maintainer of PrefLib: A Library for Preferences and the associated PrefLib:Tools available on GitHub, and is the founder/co-chair of the Exploring Beyond the Worst Case in Computational Social Choice workshop (2014-2017) held at AAMAS.


Nicholas was formerly a senior researcher working with Prof. Toby Walsh in the AI & Algorithmic Decision Theory Group at Data61 (formerly known as the Optimisation Group at NICTA).  He was also an adjunct lecturer in the School of Computer Science and Engineering (CSE) and a member of the Algorithms Group at the University of New South Wales. He previously worked as a programmer and embedded electronics designer for nano-satellites at NASA Ames Research Center. He received his Ph.D. from the University of Kentucky under the supervision of Prof. Judy Goldsmith in 2012.

Refreshments served at 3:30 p.m.

Dr. Nicholas Mattei, IBM T. J. Watson Research Laboratory
Bruggeman Conference Room, CBIS 4:00 pm

Apr
5
2018
Massive-Scale Analytics

Reception at 3:30pm

Abstract:

Emerging real-world graph problems include: detecting community structure in large social networks; improving the resilience of the electric power grid; and detecting and preventing disease in human populations. Unlike traditional applications in computational science and engineering, solving these problems at scale often raises new challenges because of the sparsity and lack of locality in the data, the need for additional research on scalable algorithms and development of frameworks for solving these problems on high performance computers, and the need for improved models that also capture the noise and bias inherent in the torrential data streams. In this talk, the speaker will discuss the opportunities and challenges in massive data-intensive computing for applications in computational science and engineering.

Biography:

David A. Bader is Professor and Chair of the School of Computational Science and Engineering, College of Computing, at Georgia Institute of Technology. He is a Fellow of the IEEE and AAAS and served on the White House's National Strategic Computing Initiative (NSCI) panel. Dr. Bader served as a board member of the Computing Research Association, on the NSF Advisory Committee on Cyberinfrastructure, on the Council on Competitiveness High Performance Computing Advisory Committee, on the IEEE Computer Society Board of Governors, on the Steering Committees of the IPDPS and HiPC conferences, and as editor-in-chief of IEEE Transactions on Parallel and Distributed Systems, and is a National Science Foundation CAREER Award recipient. Dr. Bader is a leading expert in data sciences. His interests are at the intersection of high-performance computing and real-world applications, including cybersecurity, massive-scale analytics, and computational genomics, and he has co-authored over 210 articles in peer-reviewed journals and conferences. During his career, Dr. Bader has served as PI/co-PI of over $180M in competitive awards. Dr. Bader has served as a lead scientist in several DARPA programs including High Productivity Computing Systems (HPCS) with IBM PERCS, Ubiquitous High Performance Computing (UHPC) with NVIDIA ECHELON, Anomaly Detection at Multiple Scales (ADAMS), Power Efficiency Revolution For Embedded Computing Technologies (PERFECT), and Hierarchical Identify Verify Exploit (HIVE). He has also served as Director of the Sony-Toshiba-IBM Center of Competence for the Cell Broadband Engine Processor. Bader is a co-founder of the Graph500 List for benchmarking "Big Data" computing platforms. Bader has been recognized as a "RockStar" of High Performance Computing by InsideHPC and as one of HPCwire's People to Watch in 2012 and 2014. Dr. Bader also serves as an associate editor for several high-impact publications including IEEE Transactions on Computers, ACM Transactions on Parallel Computing, and ACM Journal of Experimental Algorithmics. He successfully launched his school's Strategic Partnership Program in 2015, whose partners include Accenture, Booz Allen Hamilton, Cray, IBM, Keysight Technologies, LexisNexis, Northrop Grumman, NVIDIA, and Yahoo, as well as the National Security Agency, Sandia National Laboratories, Pacific Northwest National Laboratory, and Oak Ridge National Laboratory.

Flaherty Lecture Series
David A. Bader, Professor and Chair, School of Computational Science and Engineering, Georgia Institute of Technology
CBIS Auditorium 4:00 pm

Apr
3
2018
The Imagination of Ada Lovelace: An Experimental Humanities Approach

Abstract:
In the more than 200 years since Ada Lovelace's birth, she has been celebrated, neglected, and taken up as a symbol for any number of causes and ideas. This talk traces some of the paths that the idea of Lovelace, and her imagination of Charles Babbage's Analytical Engine, has taken. In particular, we focus on music and creativity, after Lovelace's idea that 'the engine might compose elaborate and scientific pieces of music of any degree of complexity or extent'.
This work began at a symposium in 2015 to mark the 200th anniversary of Lovelace's birth. Through collaborations with Pip Willcox (head of the Centre for Digital Scholarship at the Bodleian Libraries), composer Emily Howard (Royal Northern College of Music), and mathematician Ursula Martin, we have conducted a series of experiments and demonstrations inspired by the work of Lovelace and Babbage. These include simulations of the Analytical Engine, use of a web-based music application, construction of hardware, reproduction of earlier mathematical results using contemporary computational methods, and a musical performance based on crowd-sourced algorithmic fragments.
Recently we have introduced Charles Wheatstone to our experiments: Wheatstone deployed an electric telegraph in the same period, but could an electromechanical computer have been constructed? These digital experiments bring insight into, and engagement with, historical scenarios. Our designed digital artefacts can be viewed as design fictions, or as critical works explicating our interpretation of Lovelace's words: digital prototyping as a means of close reading. We frame this as Experimental Humanities.

Bio:
David De Roure is a Professor of Computer Science in the School of Electronics and Computer Science at the University of Southampton, UK. A founding member of the School's Intelligence, Agents, Multimedia Group, he leads the e-Research activities and is Director of the Pervasive Systems Centre. David's work focuses on creating new research methods in and between multiple disciplines, particularly through the co-design of new tools. His projects draw on Web 2.0, Semantic Web, workflow, and scripting technologies; he pioneered the Semantic Grid initiative and is an advocate of Science 2.0. He is closely involved in UK e-Science and e-Social Science programmes in projects including myExperiment, CombeChem, myGrid, LifeGuide, e-Research South, and the Open Middleware Infrastructure Institute. David has worked for many years with distributed information systems and distributed programming languages, with leading roles in the Web, hypertext, and Grid communities. He is a Scientific Advisory Council member of the Web Science Research Initiative and a Fellow of the British Computer Society.

Professor David De Roure, Oxford e-Research Centre, University of Oxford
Winslow Bldg, Rm 1140 12:30 pm

Mar
20
2018
A Framework for Structural Input/Output and Control Configuration Selection of Complex Networks

Control network design consists mainly of two steps: input/output (I/O) selection and control configuration (CC) selection. The first is devoted to the problem of computing how many actuators/sensors are needed and where they should be placed in the plant to obtain some desired property. Control configuration is related to the decentralized control problem and is dedicated to the task of selecting which outputs (sensors) should be available for feedback and to which inputs (actuators), in order to achieve a predefined goal. The choice of inputs and outputs affects the performance, complexity, and cost of the control system. Due to the combinatorial nature of the selection problem, an efficient and systematic method is required to complement the designer's intuition, experience, and physical insight.

Motivated by the above, this presentation addresses structural control system design, taking explicitly into consideration possible application to large-scale systems. We provide an efficient framework to solve the following major minimization problems: i) selection of the minimum number of manipulated/measured variables to achieve structural controllability/observability of the system, and ii) selection of the minimum number of measured and manipulated variables, and feedback interconnections between them, such that the system has no structural fixed modes. Contrary to what might be expected, we show that it is possible to obtain the global solution of the aforementioned minimization problems with polynomial complexity in the number of state variables of the system. To this effect, we propose a methodology that is efficient (polynomial complexity) and unified, in the sense that it solves the I/O and CC selection problems simultaneously. This is done by exploiting the implications of the I/O selection in the solution to the CC problem.
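One classical polynomial-time ingredient behind this kind of input selection is maximum bipartite matching over the system's structure: state variables left unmatched must receive a dedicated input. The sketch below illustrates that building block (following the well-known matching argument of Liu, Slotine, and Barabási for minimum driver-node selection), not the speaker's full I/O-and-CC framework.

```python
# Hedged illustration: minimum input selection via bipartite matching.
import networkx as nx

def min_driver_nodes(edges, n):
    """edges: directed structural links (i -> j) over states 0..n-1.
    Returns states needing a dedicated input: those whose 'in' copy
    is unmatched in a maximum bipartite matching."""
    B = nx.Graph()
    B.add_nodes_from(("out", i) for i in range(n))
    B.add_nodes_from(("in", j) for j in range(n))
    B.add_edges_from((("out", i), ("in", j)) for i, j in edges)
    matching = nx.bipartite.maximum_matching(
        B, top_nodes=[("out", i) for i in range(n)])
    matched_in = {v for v in matching if v[0] == "in"}
    drivers = [j for j in range(n) if ("in", j) not in matched_in]
    return drivers or [0]  # a perfectly matched system still needs one input

# Example: the chain 0 -> 1 -> 2 needs a single input, at node 0.
print(min_driver_nodes([(0, 1), (1, 2)], 3))
```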

Computer Science and the NeST Center
Professor Sergio Pequito, Industrial and Systems Engineering, Rensselaer Polytechnic Institute
Troy Bldg, Room 2018 4:00 pm

2017

Nov
14
2017
Finding Sources in Spreading Processes

The phenomenon of spreading is very broad: it can mean the spreading of electromagnetic waves, wind-blown seeds, diseases, or information. Spreading can also happen in various environments, from simple spatial settings (like waves on water) to complicated networks (like information in society). Quite often it is evident there is a spread, but it is not directly known where it came from. For physical phenomena that feature constant-velocity spreading in space, simple triangulation may suffice to pinpoint the source. But if the process happens in a complex network and is itself so complex that it can only be described stochastically (as with epidemics or information spreading), then finding the source becomes a more complicated problem.

Both epidemics and information spreading share a common property: no total mass, energy, count, or volume is conserved, as it is, for example, in spreading waves (energy) or diffusing substances (mass/volume). Because of that, both can be modeled in a similar way, for example by a basic SI (Susceptible-Infected) model. The presentation will describe some existing methods for finding sources of spread in such processes, focus on a method based on maximum likelihood introduced by Pinto et al., and describe derivative methods that offer much better performance as well as improved accuracy.

Assume we have a network where information or an epidemic has spread, and we only know when it arrived at some specific points in the network (which we call observers). Further assumptions are that spreading always happens along shortest paths from the source to any node, and that the delays on links can be approximated by a normal distribution. It is then possible, for each potential source in the network (every node, in general), to calculate expected arrival times as well as the variances and covariances of these times. For each node we therefore have a multivariate normal distribution of arrival times at all observers. Comparing the distributions with the actual observations, we can find the source whose distribution best fits the observed arrival times.
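A minimal sketch of this estimator appears below, assuming i.i.d. Gaussian edge delays, shortest-path spreading, and a diagonal covariance (the full method also models covariances arising from shared path edges); `networkx` and `scipy` stand in for whatever implementation the speaker uses.

```python
# Hedged sketch of the maximum-likelihood source estimator.
import networkx as nx
import numpy as np
from scipy.stats import multivariate_normal

def estimate_source(G, observers, arrival_times, mu=1.0, sigma2=0.3):
    ref, others = observers[0], observers[1:]
    # Differences to a reference observer cancel the unknown start time.
    t = np.array([arrival_times[o] - arrival_times[ref] for o in others])
    best, best_ll = None, -np.inf
    for s in G.nodes():
        d_ref = nx.shortest_path_length(G, s, ref)
        d = np.array([nx.shortest_path_length(G, s, o) for o in others])
        mean = mu * (d - d_ref)
        # Edge-disjoint-paths simplification: Var(t_k - t_ref) ~ d_k + d_ref.
        cov = sigma2 * np.diag(d + d_ref + 1e-9)
        ll = multivariate_normal(mean=mean, cov=cov,
                                 allow_singular=True).logpdf(t)
        if ll > best_ll:
            best, best_ll = s, ll
    return best
```

Scoring every candidate node against every observer this way is what drives the cost discussed next; the derivative methods restrict both the candidate set and the observer set.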

One drawback of this method is that for a large number N of nodes in the network, the computational cost becomes prohibitive, with complexity up to O(N⁴). We have proposed a derivative method that limits the set of considered observers and makes a smart choice of suspected nodes. We take into account only the ~ observers closest to the real source (those with the earliest observed infection times), greatly decreasing computational complexity. Instead of calculating the distribution for every node, we start with the closest observer and follow a gradient of increasing likelihood. These changes not only greatly improve performance (complexity at worst O(N² log N)) but also increase accuracy in scale-free network topologies.

Bio:
Krzysztof Suchecki is an assistant professor at Warsaw University of Technology, Faculty of Physics. He has MSc and PhD degrees in physics. His research topics focus on dynamics of and on complex networks, such as Ising and voter model dynamics, including co-evolution of network structure and dynamical node states. His current research is focused on spreading of information in networks and methods to identify sources of information.

The NeST Center and Computer Science
Prof. Krzysztof Suchecki, Faculty of Physics, Warsaw University of Technology, Poland
CII (LOW) 3051 4:00 pm

Nov
8
2017
WALA Everywhere

The Watson Libraries for Analysis (WALA) started life as an analysis framework for Java bytecode, but soon grew to include JavaScript and .NET bytecode as well. I shall briefly summarize this history and the design decisions that grew out of it, because those decisions enabled further expansion after WALA became open source: from Android to node.js, WALA handles more systems, with much of the work being done by the community and contributed to the open source project. I shall talk about three of those expansions. The first is analysis of hybrid mobile applications, in which Android bytecode and JavaScript source code are analyzed together to create a cross-language analysis result; I shall present the WALA architecture that enables such analysis and talk about how it has been used so far. The second is WALA Client, in which much of WALA runs in a Web browser, including a version of our JavaScript analysis, potentially enabling live analysis of visited Web sites. I shall discuss how this is accomplished technically, and I shall show it running as part of the talk. Third, I shall discuss our ongoing work building analysis of Apple's Swift language. Throughout this talk, I shall try to bring out how WALA design decisions made in the beginning have enabled this expansion, and how an increasing community of researchers has made it possible.

Bio:
Julian Dolby is a Research Staff Member at the IBM Thomas J. Watson Research Center, where he works on a range of topics, including static program analysis, software testing, concurrent programming models, and the Semantic Web. He is one of the primary authors of the publicly available Watson Libraries for Analysis (WALA) program analysis infrastructure, and his recent WALA work has focused on creating the WALA Mobile infrastructure.

His program analysis work has recently focused on scripting languages like JavaScript and on security analysis of Web applications; this work has been included in IBM products, most notably Rational AppScan, Standard Edition and Source Edition. He was educated at the University of Illinois at Urbana-Champaign as a graduate student where he worked with Professor Andrew Chien on programming systems for massively-parallel machines.

 

Dr. Julian Dolby, Thomas J. Watson Research Center, Yorktown Heights, NY
Sage 3510 4:00 pm

Aug
31
2017
Professor Santo Fortunato, Director of the Center for Complex Networks and Systems Research, Indiana University
Bruggeman Conference Center, Center for Biotechnology and Interdisciplinary Studies (CBIS) 4:00 pm

May
4
2017
Data to Decisions for the Next Generation of Complex Engineered Systems

Abstract:

The next generation of complex engineered systems will be endowed with sensors and computing capabilities that enable new design concepts and new modes of decision-making. For example, new sensing capabilities on aircraft will be exploited to assimilate data on system state, make inferences about system health, and issue predictions on future vehicle behavior---with quantified uncertainties---to support critical operational decisions. However, data alone is not sufficient to support this kind of decision-making; our approaches must exploit the synergies of physics-based predictive modeling and dynamic data. This talk describes our recent work in adaptive and multifidelity methods for optimization under uncertainty of large-scale problems in engineering design. We combine traditional projection-based model reduction methods with machine learning methods to create data-driven adaptive reduced models. We develop multifidelity formulations to exploit a rich set of information sources, using cheap approximate models as much as possible while maintaining the quality of higher-fidelity estimates and associated guarantees of convergence.
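As a concrete (and heavily simplified) illustration of the projection-based model reduction the abstract mentions, the sketch below builds a proper orthogonal decomposition (POD) basis from snapshots and projects a full-order operator onto it; the adaptive, data-driven, and multifidelity machinery of the talk goes well beyond this.

```python
# Minimal POD + Galerkin projection sketch; a simplified illustration,
# not the speaker's methods.
import numpy as np

def pod_basis(snapshots, r):
    """snapshots: n x m matrix whose columns are full-order states.
    Returns an orthonormal n x r basis of leading left singular vectors."""
    U, _, _ = np.linalg.svd(snapshots, full_matrices=False)
    return U[:, :r]

def galerkin_reduce(A, V):
    """Project a linear full-order operator: dx/dt = A x becomes
    dz/dt = (V^T A V) z, with the approximation x ~ V z."""
    return V.T @ A @ V
```

Given simulation snapshots X, `V = pod_basis(X, r)` and `galerkin_reduce(A, V)` yield an r-dimensional surrogate whose online cost is independent of the full dimension n.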

Bio:

Karen E. Willcox is Professor of Aeronautics and Astronautics at the Massachusetts Institute of Technology. She is also Co-Director of the MIT Center for Computational Engineering and formerly the Associate Head of the MIT Department of Aeronautics and Astronautics. Before joining the faculty at MIT, she worked at Boeing Phantom Works with the Blended-Wing-Body aircraft design group. Willcox is currently Co-director of the Department of Energy DiaMonD Multifaceted Mathematics Capability Center on Mathematics at the Interfaces of Data, Models, and Decisions, and she leads an Air Force MURI on optimal design of multi-physics systems. She is also active in education innovation, serving as co-Chair of the MIT Online Education Policy Initiative, co-Chair of the 2013-2014 Institute-wide Task Force on the Future of MIT Education, and lead of the Fly-by-Wire project developing blended learning technology as part of the Department of Education's First in the World program.


 

Flaherty Lecture Series
Karen Willcox, Professor of Aeronautics and Astronautics, Massachusetts Institute of Technology
Isermann Auditorium 4:00 pm

Apr
25
2017
A Duality Based Framework for Bayesian Mechanism Design

Abstract:

A central theme in mechanism design is understanding the tradeoff between simplicity and optimality of the designed mechanism. An important and challenging task here is to design simple multi-item mechanisms that can approximately optimize revenue, as the revenue-optimal mechanisms are known to be extremely complex and thus hard to implement. Recently, we have witnessed several breakthroughs on this front, obtaining simple and approximately optimal mechanisms when the buyers have unit-demand (Chawla et al. '10) or additive (Yao '15) valuations. Although these two settings are relatively similar, the techniques employed in these results are completely different and seemed difficult to extend to more general settings. In this talk, I will present a principled approach to designing simple and approximately optimal mechanisms based on duality theory. Our approach unifies and improves both of the aforementioned results, and extends these simple mechanisms to broader settings, e.g., multiple buyers with XOS valuations.
Based on joint work with Nikhil R. Devanur, Matt Weinberg and Mingfei Zhao.
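For a flavor of what "simple mechanisms" means in this literature, the toy sketch below compares two canonical simple mechanisms for a single additive buyer (posted item prices vs. one price for the grand bundle) on sampled valuations; results such as Babaioff et al.'s show the better of the two approximates optimal revenue in the additive setting. The distributions and price grids here are illustrative, not from the talk.

```python
# Toy revenue comparison of two simple mechanisms for one additive buyer.
import numpy as np

rng = np.random.default_rng(1)
vals = rng.random((100_000, 2))        # sampled valuations: 2 items, U[0,1]

def revenue_separate(v, price):
    """Expected revenue posting the same price on each item separately."""
    return (v >= price).sum(axis=0).sum() * price / len(v)

def revenue_bundle(v, price):
    """Expected revenue posting one price for the grand bundle."""
    return (v.sum(axis=1) >= price).mean() * price

prices = np.linspace(0.01, 2.0, 200)
print("separate:", max(revenue_separate(vals, p) for p in prices))
print("bundle:  ", max(revenue_bundle(vals, p) for p in prices))
```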

Bio:

Yang Cai is a William Dawson Assistant Professor of Computer Science at McGill University. He received his PhD in computer science from MIT, advised by Costis Daskalakis. He was a postdoctoral researcher at UC Berkeley. Yang's research interests lie in the area of theoretical computer science, in particular algorithmic game theory, online algorithms, and logic.


 

Yang Cai, Assistant Professor, McGill University
Troy 2018 4:00 pm

Apr
6
2017
UNVEIL: A Large-Scale, Automated Approach to Detecting Ransomware

Abstract: Although the concept of ransomware is not new (i.e., such attacks date back at least as far as the 1980s), this type of malware has recently experienced a resurgence in popularity. In fact, in 2014 and 2015, a number of high-profile ransomware attacks were reported, such as the large-scale attack against Sony that prompted the company to delay the release of the film "The Interview". Ransomware typically operates by locking the desktop of the victim to render the system inaccessible to the user, or by encrypting, overwriting, or deleting the user's files. However, while many generic malware detection systems have been proposed, none of these systems have attempted to specifically address the ransomware detection problem. In this keynote, I talk about some of the trends we are seeing in ransomware. Then, I present a novel dynamic analysis system called UNVEIL that is specifically designed to detect ransomware. The key insight of the analysis is that in order to mount a successful attack, ransomware must tamper with a user's files or desktop. UNVEIL automatically generates an artificial user environment, and detects when ransomware interacts with user data. In parallel, the approach tracks changes to the system's desktop that indicate ransomware-like behavior. Our evaluation shows that UNVEIL significantly improves the state of the art, and is able to identify previously unknown evasive ransomware that was not detected by the anti-malware industry.
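The core detection signal is easy to caricature in code: plant decoy user files and alarm on a burst of modifications. The sketch below is a toy along those lines (using the third-party `watchdog` package; the directory name and thresholds are made up), not UNVEIL itself, which generates full artificial user environments and also tracks desktop changes.

```python
# Toy decoy-file tamper monitor; illustrative only, not UNVEIL.
import time
from pathlib import Path
from watchdog.events import FileSystemEventHandler
from watchdog.observers import Observer

DECOY = Path("decoy_user_env")  # hypothetical artificial user environment

class TamperMonitor(FileSystemEventHandler):
    def __init__(self, threshold=10, window=5.0):
        self.times, self.threshold, self.window = [], threshold, window

    def on_modified(self, event):
        now = time.time()
        self.times = [t for t in self.times if now - t < self.window]
        self.times.append(now)
        if len(self.times) >= self.threshold:
            print("ransomware-like burst of file modifications detected")

if __name__ == "__main__":
    DECOY.mkdir(exist_ok=True)
    for i in range(20):                      # seed decoy "user" documents
        (DECOY / f"doc_{i}.txt").write_text("decoy content")
    obs = Observer()
    obs.schedule(TamperMonitor(), str(DECOY), recursive=True)
    obs.start()
    try:
        while True:
            time.sleep(1)
    except KeyboardInterrupt:
        obs.stop()
    obs.join()
```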

Bio: Engin Kirda holds the post of professor of computer science at Northeastern University in Boston. Before that, he held faculty positions at Institute Eurecom on the French Riviera and at the Technical University of Vienna, where he co-founded the Secure Systems Lab that is now distributed over five institutions in Europe and the United States. Professor Kirda's research has focused on malware analysis (e.g., Anubis, Exposure, and Fire) and detection, web application security, and practical aspects of social networking security. He has co-authored more than 100 peer-reviewed scholarly publications and served on the program committees of numerous well-known international conferences and workshops. Professor Kirda was the program chair of the International Symposium on Recent Advances in Intrusion Detection (RAID) in 2009, the program chair of the European Workshop on Systems Security (Eurosec) in 2010 and 2011, the program chair of the well-known USENIX Workshop on Large Scale Exploits and Emergent Threats in 2012, and the program chair of the flagship security conference, the Network and Distributed System Security Symposium (NDSS), in 2015. In the past, Professor Kirda has consulted for the European Commission on emerging threats, and recently gave a Congressional Briefing in Washington, D.C. on advanced malware attacks and cyber-security. He also spoke at SXSW Interactive 2015 about "Malware in the Wild" and at Blackhat 2015. Besides his roles at Northeastern, Professor Kirda is a co-founder of Lastline Inc., a Silicon Valley-based company that specializes in the detection and prevention of advanced targeted malware.

TBD 4:00 pm

Feb
28
2017
A Complex Networks Approach to Data Science: Modeling, Representation and Analysis of Interconnected Large-Scale Data Structures

Abstract:  Driven by modern applications and the abundance of empirical network data, network science aims at analysing real-world complex systems arising in the social, biological, and physical sciences by abstracting them into networks (or graphs).  The size and complexity of real networks have produced a deep change in the way that graphs are approached, and have led to the development of new mathematical and computational tools.  In this talk, I will present a data-driven, network-based methodology for approaching data analysis problems arising in a variety of contexts, while highlighting state-of-the-art network models including spatial, multilayer, interdependent, and modular networks.  I will describe different stages in the analysis of data, starting from (1) network representations of urban transportation systems, then (2) inference of structural patterns in networks and phase transitions in their detectability, and finally (3) understanding the implications of structural features for dynamical processes taking place on networks.  I will conclude with a discussion of the future directions of my work and its intersection with complexity theory, machine learning, and statistics.
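As a small, hedged illustration of stage (2), inferring structural patterns, the snippet below runs modularity-based community detection on a classic benchmark graph with `networkx`; the talk's methods, and the detectability phase-transition analysis in particular, go much further than this.

```python
# Community detection on a standard benchmark graph.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

G = nx.karate_club_graph()
for k, community in enumerate(greedy_modularity_communities(G)):
    print(f"community {k}: {sorted(community)}")
```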

 

Bio:  I received my Bachelor of Science degree, cum laude, in Mathematics and Computer Science (double degree) from the Israel Institute of Technology (Technion) in 2008.  I then shifted my focus towards acquiring industrial experience and worked as a Software Engineer at Diligent Technologies, an Israeli startup that was acquired by IBM.  I completed my PhD in Computer Science at the University of St Andrews, UK under the guidance of Simon Dobson in 2014.  During my PhD I was supported by a full prize scholarship from the Scottish Informatics and Computer Science Alliance (SICSA).  In 2014 I joined Peter Mucha's group in the Department of Mathematics at the University of North Carolina at Chapel Hill as a Postdoctoral Scholar.  My research has been focused on the development of mathematical and computational tools for modeling and analyzing complex systems, roughly defined as large networks of simple components with no central control that give rise to emergent complex behavior.  I believe that looking at data through a "network lens" provides us with useful perspectives on diverse problems, from designing optimal transportation systems and smart cities to clustering high-dimensional data.  I am constantly looking for new datasets on which I can apply my "network machinery" to solve real-world problems and to inspire the development of new methodologies.

Dr. Saray Shai, University of North Carolina, Chapel Hill
Troy 2018 4:00 pm

Feb
21
2017
Data-Driven Behavioral Analytics with Networks

Abstract: Thanks to information technologies, online user behaviors are broadly recorded at an unprecedented level.  This gives us an opportunity to gain insights into human behaviors and our societies from real data, at a scale that makes manual analysis and inspection completely impractical, when building intelligent and trustworthy systems.  In this talk I will discuss research problems, challenges, principles, and methodologies of developing network-based computational models for behavioral analysis.  Specifically, I will present recent approaches on (1) modeling user behavior intentions with knowledge from the social and behavioral sciences for behavior prediction, recommendation, and suspicious behavior detection, (2) modeling social and spatiotemporal information to extract knowledge from behavioral contexts, (3) structuring behavioral content into information networks of entities and attributes, and (4) integrating structured and unstructured behavior data to support decision-making and information systems.  Two results, CatchSync and MetaPAD, will be presented in detail.  I will conclude with my thoughts on future directions.

 

Bio:  Dr. Meng Jiang is a postdoctoral researcher at the University of Illinois at Urbana-Champaign.  He received his Ph.D. from the Department of Computer Science at Tsinghua University in 2015 and obtained his bachelor's degree from the same department in 2010.  He visited Carnegie Mellon University in 2013 and the University of Maryland, College Park in 2016.  Find more about him here: http://www.meng-jiang.com.

His research lies in the field of data mining, focusing on user behavior modeling.  He has contributed two book chapters and delivered two conference tutorials on this topic.  His Ph.D. thesis won the Dissertation Award at Tsinghua.  His work on "Suspicious Behavior Detection" was selected as one of the Best Paper Finalists at KDD '14.  His work on "Social Contextual Recommendation" has been deployed in the Tencent social network.  The package from his work on "Attribute Discovery from Text Corpora" is being transferred to the U.S. Army Research Lab.  He has published 20 papers with 560+ citations, including 15 papers with 360+ citations as the first author.

Dr. Meng Jiang, Postdoctoral Researcher, University of Illinois at Urbana-Champaign
Troy 2018 4:00 pm

Feb
16
2017
Structures and Dynamics of Complex Systems - Network Resilience, Robustness, and Control

Abstract: Complex systems exist in almost every aspect of science and technology.  My research focuses on how to understand, predict, control, and ultimately survive real-world complex systems in an ever-changing world facing the global challenges of climate change, weather extremes, and other natural and human disasters.  I will present three recent works in the field of network science and complex systems: resilience, robustness, and control.  (I) Resilience, a system's ability to adjust its activity to retain its basic functionality when errors and environmental changes occur, is a defining property of many complex systems.  I will show a set of analytical tools with which to identify the natural control and state parameters of a multi-dimensional complex system, helping us derive an effective one-dimensional dynamics that accurately predicts the system's resilience.  The analytical results unveil the network characteristics that can enhance or diminish resilience, offering ways to prevent the collapse of ecological, biological, or economic systems, and guiding the design of technological systems resilient to both internal failures and environmental changes.  (II) Increasing evidence shows that real-world systems interact with one another, and the real goal of network science shouldn't just be to understand individual networks, but to decipher the dynamical interactions in networks of networks (NONs).  Malfunction of a few nodes in one network layer can cause cascading failures and catastrophic collapse of the entire system.  I will show a general theoretical framework for analyzing the robustness of, and cascading failures in, NONs.  The results on NONs have been surprisingly rich, and they differ from those of single networks in that they present a new paradigm.  (III) Controlling complex networks is the ultimate goal of understanding their dynamics.  I will present a k-walk theory and a greedy algorithm for target control of complex networks.  Extending from these three aspects of current research, I will describe two future research directions: universality of competitive dynamics, and control and recovery of damaged complex systems.
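The one-dimensional reduction mentioned in (I) can be sketched concretely. The snippet below is my paraphrase of the published framework (in the spirit of Gao, Barzel, and Barabási, Nature 2016), not code from the talk: the network collapses to one effective state with a single topology parameter.

```python
# Hedged sketch of the one-dimensional resilience reduction.
import numpy as np

def beta_eff(A):
    """Single effective topology parameter: 1^T A A 1 / (1^T A 1)."""
    one = np.ones(A.shape[0])
    return (one @ A @ A @ one) / (one @ A @ one)

def x_eff(A, x):
    """Degree-weighted effective state: 1^T A x / (1^T A 1)."""
    one = np.ones(A.shape[0])
    return (one @ A @ x) / (one @ A @ one)

# The networked system dx_i/dt = F(x_i) + sum_j A_ij G(x_i, x_j) is then
# approximated by one scalar equation,
#     dx_eff/dt = F(x_eff) + beta_eff(A) * G(x_eff, x_eff),
# whose fixed points as beta_eff varies map out resilience and collapse.
```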

 

Bio:  Dr. Jianxi Gao is a research assistant professor in the Center for Complex Network Research at Northeastern University.  Dr. Gao received his Ph.D. from Shanghai Jiao Tong University (2008-2012).  During his Ph.D. he was a visiting scholar at Prof. H. Eugene Stanley's lab at Boston University (2009-2012).  Dr. Gao's major contributions include the theory for robustness of networks of networks and resilience of complex networks.  Since 2010, Dr. Gao has published over 20 journal papers in Nature, Nature Physics, Nature Communications, Proceedings of the National Academy of Sciences, Physical Review Letters, and more, with over 1,500 citations on Google Scholar.  Dr. Gao has been selected for the editorial board of Nature Scientific Reports, as a distinguished referee of EPL (2014-2016), and as a referee for Science, PNAS, PRL, PRX, and more.  His publications have been reported on over 20 times by international public and professional media.

Bruggeman Conference Room, CBIS 4:00 pm

Feb
14
2017
Fusion of Multiple Heterogeneous Social Networks for Synergistic Knowledge Discovery

Abstract: Can the people we follow on Twitter be recommended as potential friends on Facebook?  What box office can US movies achieve in China?  How do weather and nearby points-of-interest (POIs) affect the traffic routes planned for vehicles?  About the same information entities, a large amount of information can be collected from various sources, each of which provides a specific signature of the entity from a unique underlying aspect.  Effective fusion of these different information sources provides an opportunity to understand the information entities more comprehensively.

My thesis work investigates the principles, methodologies, and algorithms for knowledge discovery across multiple aligned information sources, and evaluates the corresponding benefits.  Fusing and mining multiple information sources of large Volumes and diverse Varieties is a fundamental problem in Big Data studies.  In this talk, I will discuss information fusion and synergistic knowledge discovery, focusing on online social media, and present my algorithmic work on multi-source learning frameworks together with evaluation results.  I will also provide my future vision of fusion learning for broader real-world applications at the conclusion of the talk.

 

Bio:  Jiawei Zhang is a Ph.D. candidate in the Department of Computer Science at the University of Illinois at Chicago (UIC), under the supervision of Prof. Philip S. Yu since August 2012.  Prior to joining UIC, he obtained his Bachelor's degree in Computer Science from Nanjing University, in China.  His research interests span the fields of Data Science, Data Mining, Network Mining, and Machine Learning.  His research focuses on fusing multiple large-scale information sources of diverse varieties and carrying out synergistic data mining tasks across these fused sources in one unified analytic framework.

His fusion learning work has appeared in KDD, ICDM, SDM, ECML/PKDD, IJCAI, WWW, WSDM, CIKM, and IEEE Transactions on Knowledge and Data Engineering (TKDE).  He received the Best Student Paper Runner-Up Award at ASONAM '16.  He has been serving as the information director of ACM Transactions on Knowledge Discovery from Data (TKDD) since August 2014.  He is also a PC member of WWW '17, KDD '16, CIKM '16, CIKM '15, and AIRS '16.  Besides his academic experience at the University of Illinois at Chicago, he also has industrial research experience at Microsoft Research (2014) and the IBM T.J. Watson Research Center (2015).

Troy 2018 4:00 pm

Feb
9
2017
Latent geometry in networked systems: from Internet interdomain routing to human diseases

The prediction and control of the dynamics of networked systems is one of the central problems in network science.  Structural and dynamical similarities of different real networks suggest that some universal laws might accurately describe the dynamics of these networks, though the nature and common origins of such laws remain elusive.  Do these universal laws exist?  We do not have the answer to this question...yet.  I will talk about the latent geometry approach to networked systems, which, in my opinion, could be a first step toward the formulation of universal laws of network dynamics.  In this approach, networks underlying complex systems are viewed as discretizations of smooth geometric spaces: network nodes are points in these spaces, and the probability of a connection between two nodes depends on the distance between them.  I will start my talk with a motivation and a high-level introduction to the latent geometry concept.  I will continue with a (semi)rigorous discussion of the mathematics underlying the approach and computational algorithms for uncovering latent geometries of real systems.  I will conclude my talk by describing existing and prospective applications of latent geometry, including Internet interdomain routing, large-scale dynamics of networked systems, human diseases, and social dynamics.
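To make the latent-geometry idea concrete, here is a background sketch of the standard random hyperbolic graph model often used in this line of work (all parameters illustrative; this is not code from the talk): nodes get polar coordinates, a popularity radius r and a similarity angle θ, and connect with probability that decays with hyperbolic distance.

```python
# Random hyperbolic graph sketch with a Fermi-Dirac connection kernel.
import numpy as np

def random_hyperbolic_graph(n=200, R=12.0, T=0.5, alpha=1.0, seed=0):
    rng = np.random.default_rng(seed)
    theta = rng.uniform(0.0, 2.0 * np.pi, n)                      # similarity
    u = rng.random(n)
    r = np.arccosh(1.0 + u * (np.cosh(alpha * R) - 1.0)) / alpha  # popularity
    edges = []
    for i in range(n):
        for j in range(i + 1, n):
            dth = np.pi - abs(np.pi - abs(theta[i] - theta[j]))
            # Approximate hyperbolic distance between the two points.
            x = r[i] + r[j] + 2.0 * np.log(np.sin(dth / 2.0) + 1e-12)
            p = 1.0 / (1.0 + np.exp((x - R) / (2.0 * T)))
            if rng.random() < p:
                edges.append((i, j))
    return r, theta, edges
```

In such models, greedy routing by latent distance works remarkably well, which is the link to the Internet interdomain routing application mentioned above.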

 

Bio: Dr. Kitsak is an associate research scientist in the Department of Physics and the Network Science Institute at Northeastern University.  Dr. Kitsak earned his Ph.D. in theoretical physics from Boston University in 2009 under the direction of Prof. H.E. Stanley.  Dr. Kitsak has held postdoctoral positions at the Center for Applied Internet Data Analysis (CAIDA), UC San Diego (2009-2012), and the Center for Complex Network Research (CCNR), Northeastern University (2012-2014).  His research focuses on the development of theoretical and computational approaches to networked systems.

Library Fischbach Room 4:00 pm

Feb
7
2017
Probabilistic Programming: Past, Present, and Future

Abstract:  Probabilistic reasoning lets you predict the future, infer past causes of current observations, and learn from experience.  It can be hard to implement a probabilistic application because you have to implement the representation, inference, and learning algorithms.  Probabilistic programming makes this much easier by providing an expressive language to represent models as well as inference and learning algorithms that automatically apply to models written in the language.  In this talk, I will present the past, present, and future of probabilistic programming and our Figaro probabilistic programming system.  I will start with the motivation for probabilistic programming and Figaro.  After presenting some basic Figaro concepts, I will introduce several applications we have been developing at Charles River Analytics using Figaro.  Finally, I will describe our future vision of providing a probabilistic programming tool that domain experts with no machine learning knowledge can use.  In particular, I will present a new inference method that is designed to work well on a wide variety of problems with no user configuration.  Prior knowledge of machine learning is not required to follow this talk.
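To give a flavor of the paradigm, here is a toy in Python (not Figaro, which is a Scala library): the model is an ordinary program over named random choices, and one generic inference routine, brute-force enumeration here, performs exact inference for any model written against it.

```python
# Toy probabilistic-programming illustration; not Figaro.
import itertools

def posterior(model, priors, query, evidence):
    """P(query=True | evidence) by enumerating all flip assignments."""
    names = list(priors)
    num = den = 0.0
    for values in itertools.product([False, True], repeat=len(names)):
        a = dict(zip(names, values))
        w = 1.0
        for nm in names:                       # prior weight of assignment
            w *= priors[nm] if a[nm] else 1.0 - priors[nm]
        out = model(a)
        if all(out[k] == v for k, v in evidence.items()):
            den += w
            if out[query]:
                num += w
    return num / den

def model(a):                                  # rain/sprinkler -> wet grass
    return {"rain": a["rain"], "sprinkler": a["sprinkler"],
            "wet": a["rain"] or a["sprinkler"]}

# Infer a past cause from a current observation:
print(posterior(model, {"rain": 0.2, "sprinkler": 0.1}, "rain", {"wet": True}))
```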

Bio: Dr. Avi Pfeffer is Chief Scientist at Charles River Analytics.  Dr. Pfeffer is a leading researcher on a variety of computational intelligence techniques including probabilistic reasoning, machine learning, and computational game theory.  Dr. Pfeffer has developed numerous innovative probabilistic representation and reasoning frameworks, such as probabilistic programming, which enables the development of probabilistic models using the full power of programming languages, and statistical relational reasoning.  He is the lead developer of Charles River Analytics' Figaro probabilistic programming language.  As an Associate Professor at Harvard, he developed IBAL, the first general-purpose probabilistic programming language.  While at Harvard, he also produced systems for representing, reasoning about, and learning the beliefs, preferences, and decision-making strategies of people in strategic situations.  Prior to joining Harvard, he invented object-oriented Bayesian networks and probabilistic relational models, which form the foundation of the field of statistical relational learning.  Dr. Pfeffer serves as Program Chair of the Conference on Uncertainty in Artificial Intelligence.  He has published many journal and conference articles and is the author of a text on probabilistic programming.  Dr. Pfeffer received his Ph.D. in computer science from Stanford University and his B.A. in computer science from the University of California, Berkeley.
 

Dr. Avi Pfeffer, Chief Scientist at Charles River Analytics
Troy 2018 4:00 pm

Jan
31
2017
Cognitive and Immersive Systems Lab (CISL)

With great progress made in the areas of Cognitive Computing and Human-scale Sensory Research, the paradigm of human-computer interaction will become a collaboration between human beings and intelligent machines through a human-scale and immersive experience.  What research challenges exist today to enable this new paradigm, how it will help human beings (especially groups of people), and what transformations it will bring to us are all important to us.  In this talk, Dr. Su is going to introduce the vision of the future Cognitive and Immersive Situations Room - an immersive, interactive, intelligent physical environment - and the research agenda to realize the vision.

 

Bio: Dr. Hui Su is the Director of the Cognitive and Immersive Systems Lab, a collaboration between IBM Research and Rensselaer Polytechnic Institute.  He has been a technical leader and an executive at IBM Research.  Most recently, he was the Director of the IBM Cambridge Research Lab in Cambridge, MA, responsible for a broad scope of global missions in IBM Research, including Cognitive User Experience, the Center for Innovation in Visual Analytics, and the Center for Social Business.  As a technical leader and a researcher for 20 years at IBM Research, Dr. Su has been an expert in multiple areas ranging from Human-Computer Interaction and Cloud Computing to Visual Analytics and Neural Network Algorithms for Image Recognition.  As an executive, he has been leading research labs and research teams in the U.S. and China.

EMPAC Studio 2 4:00 pm

2016