CIS Seminars & Events

Fall 2016 Colloquium Series

Unless otherwise noted, our lectures are held weekly on Tuesday and/or Thursday from 3:00 p.m. to 4:15 p.m. in Wu and Chen Auditorium, Levine Hall.


Monday, October 24th

Benjamin Pierce
University of Pennsylvania
Levine 307, 1:30pm
"The Science of Deep Specification"


Abstract:
Abstraction and modularity underlie all successful hardware and software systems: we build complex artifacts by decomposing them into parts that can be understood separately. Modular decomposition depends crucially on the artful choice of interfaces between pieces. As these interfaces become more expressive, we think of them as specifications of components or layers. Rich specifications based on formal logic are little used in industry today, but a practical platform for working with them could significantly reduce the costs of system implementation and evolution by identifying vulnerabilities, helping programmers understand the behavior of new components, facilitating rigorous change-impact analysis, and supporting maintainable machine-checked verification that components are correct and fit together correctly. Recently, research in the area has begun to focus on a particularly rich class of specifications, which might be called deep specifications. Deep specifications are rich (describing complex component behaviors in detail); two-sided (connected to both implementations and clients); formal (written in a mathematical notation with clear semantics to support tools such as type checkers, analysis and testing tools, automated or machine-assisted provers, and advanced IDEs); and live (connected directly to the source code of implementations via machine-checkable proofs or property-based random testing). These requirements impose strong functional correctness conditions on individual components and permit them to be connected together with rigorous composition theorems.
This talk presents the key features of deep specifications, surveys recent achievements and ongoing efforts in the research community (in particular, work at Penn, Princeton, Yale, and MIT on formalizing a rich interconnected collection of deep specifications for critical system software components), and argues that the time is ripe for an intensive effort in this area, involving both academia and industry and integrating research, education, and community building. The ultimate goal is to provide rigorously checked proofs about much larger artifacts than are feasible today, based on decomposition of proof effort across components with deep specifications.

Bio:

Benjamin Pierce is Henry Salvatori Professor of Computer and Information Science at the University of Pennsylvania and a Fellow of the ACM. His research interests include programming languages, type systems, language-based security, computer-assisted formal verification, differential privacy, and synchronization technologies. He is the author of the widely used graduate textbooks Types and Programming Languages and Software Foundations. He has served as co-Editor in Chief of the Journal of Functional Programming, as Managing Editor for Logical Methods in Computer Science, and as editorial board member of Mathematical Structures in Computer Science, Formal Aspects of Computing, and ACM Transactions on Programming Languages and Systems. He is also the lead designer of the popular Unison file synchronizer.

October 27th

Deborah Johnson
University of Virginia
"Ethical Issues in Big Data"


Abstract:
Big data has enormous potential to generate new knowledge in many different sectors – knowledge that can contribute to better health, more efficient decision-making, better law enforcement, more personalized services, and more. At the same time, the social and organizational behavior enabled by big data analytical techniques creates daunting ethical issues. This presentation begins with a discussion of the Facebook emotional contagion study, and the controversy surrounding it, in an attempt to frame big data analysis using personal data as a form of human subjects research. In seeking to get to the heart of the ethical challenges, the presentation focuses on privacy and fairness. It argues that standard accounts of privacy, especially those that focus on anonymity and the panoptic gaze, are inadequate to deal with the kind of data collection that is now possible. Much more attention should be focused on the accountability of organizations in their use of personal data, and especially on the fairness of organizational practices that are enabled by big data.

Bio:
Deborah G. Johnson is the Anne Shirley Carter Olsson Professor of Applied Ethics in the Science, Technology, and Society Program at the University of Virginia. Best known for her work on computer ethics and engineering ethics, Johnson’s research examines the ethical, social, and policy implications of technology, especially information technology.

November 1st


Emmanuel Thome
INRIA
"Large cryptanalytic computations with the Number Field Sieve"


Abstract:

Cryptography relies on assumptions about the hardness of certain mathematical computations. While a would-be attacker sometimes has means to access data that should remain confidential, it is clear that attacking these purportedly hard mathematical problems is an achievement whose reach is on a different scale: an undersized key for an asymmetric system could jeopardize the security of millions of users. In this talk, we discuss several recent (and also less recent) computations we did to assess the feasibility limits of attacks on these mathematical problems. We are interested in particular in problems such as integer factoring and the discrete logarithm problem over finite fields; these two problems underpin most of today's public-key cryptography. We discuss the algorithms used and the computational challenges associated with these attacks.

Bio:

E. Thomé is a senior research scientist with INRIA Nancy. He obtained his PhD from École polytechnique in 2003. E. Thomé co-authored several computational records, including the factorization of RSA-768, as
well as more recently a 1024-bit discrete logarithm computation for a special prime. E. Thomé also co-authored the first algorithm of quasi-polynomial complexity for the discrete logarithm problem over
binary fields.

November 8th

Grace Hopper Lecture Series
Kathleen R. McKeown
Columbia University
"At the Intersection of Data Science and Language"


Abstract:

Data science holds the promise to solve many of society’s most pressing challenges, but much of the necessary data is locked within the volumes of unstructured data on the web, including language, speech, and video. In this talk, I will describe how data science approaches are being used in research projects that draw from language data along a continuum from fact to fiction. I will present research on learning from knowledge of past disasters, as seen through the lens of the media, and on the use of data science in understanding subjective, personal narratives of those who have experienced disaster. I will conclude with analysis of both social media and novels.


Bio:

Kathleen R. McKeown is the Director of the Data Science Institute and the Henry and Gertrude Rothschild Professor of Computer Science at Columbia University. She served as Department Chair from 1998 to 2003 and as Vice Dean for Research for the School of Engineering and Applied Science for two years. A leading scholar and researcher in the field of natural language processing, McKeown focuses her research on big data; her interests include text summarization, question answering, natural language generation, and multilingual applications. She has received numerous honors and awards, including AAAI Fellow, Founding Fellow of the Association for Computational Linguistics, and ACM Fellow. Early on she received the National Science Foundation Presidential Young Investigator Award and a National Science Foundation Faculty Award for Women. In 2010, she won both the Columbia Great Teacher Award—an honor bestowed by the students—and the Anita Borg Woman of Vision Award for Innovation.

McKeown served as secretary and board member of the Computing Research Association. She was president, vice president, and secretary-treasurer of the Association for Computational Linguistics (ACL), a member of the Executive Council of the American Association for Artificial Intelligence (AAAI), co-program chair of AAAI in 1991, and conference chair of ACL in 2008.

November 10th

Faculty Candidate Talk
Mridul Aanjaneya
University of Wisconsin - Madison
"Simulation-Enhanced Visual Computing for Real World Applications"


Abstract:

Computer graphics techniques enable the creation of rich digital content that can react to the external environment in a physically realistic manner. Advances in digital data acquisition and portable display devices carry the promise of extending the capability of these techniques beyond animation to revolutionary new use cases, such as medical diagnosis and treatment, computational design and fabrication, and online education. To unleash the full potential of these methods, however, there is a need for computational algorithms and data structures that allow for high-fidelity simulations in interactive settings. A particularly interesting and challenging aspect of this problem is that of organizing computation on modern hardware platforms that are becoming increasingly heterogeneous, i.e., workstations equipped with several bandwidth-optimized accelerator cards. In this talk, I will detail the steps I have taken towards addressing this challenge. In particular, I will present a data structure that exploits the virtual memory management system to efficiently store and process multiple data channels on highly irregular voxelized domains with over a billion degrees of freedom, I will describe a numerical solver that benefits from the high memory and compute bandwidth of GPU accelerators even for problem sizes that are too large to fit entirely on GPU memory, and I will briefly summarize methods for simulating complex multi-material interactions with dynamic objects.

Bio: Dr. Mridul Aanjaneya is a postdoctoral researcher in the Department of Computer Sciences at the University of Wisconsin - Madison. Prior to joining UW-Madison, he obtained his Ph.D. in Computer Science from Stanford University. While at Stanford, he also worked as a consultant in the Spatial Technologies team at the Nokia Research Center for two years. Mridul's research lies at the intersection of Computer Graphics, Scientific Computation, Biomechanics, and Applied Mathematics. More specifically, he is interested in the design of models, computational techniques, and robust numerical algorithms that can facilitate high-level tasks such as anatomically accurate virtual surgery simulations, computational imaging, and autonomous navigation in complex external environments.


November 15th

Samuel Madden
MIT
"Interactive Data Analytics: the New Frontier"


Abstract: Data analytics often involves data exploration, where a data set is repeatedly analyzed to understand root causes, find patterns, or extract insights. Such analysis is frequently bottlenecked by the underlying data processing system, as analysts wait for their queries to complete against a complex, multilayered software stack. In this talk, I’ll describe some exploratory analytics applications we’ve built in the MIT database group over the past few years, and will then describe some of the challenges and opportunities that arise when building more efficient data exploration systems that will allow these applications to become truly interactive, even when processing billions of data points.

Bio: Samuel Madden is a Professor of Electrical Engineering and Computer Science in MIT's Computer Science and Artificial Intelligence Laboratory. His research interests include databases, distributed computing, and networking. Madden is a leader in the emerging field of "Big Data", heading the Intel Science and Technology Center (ISTC) for Big Data, a multi-university collaboration on developing new tools for processing massive quantities of data. He also leads BigData@CSAIL, an industry-backed initiative to unite researchers at MIT and leaders from industry to investigate the issues related to systems and algorithms for data that is high rate, massive, or very complex.

Madden received his Ph.D. from the University of California at Berkeley in 2003 where he worked on the TinyDB system for data collection from sensor networks. Madden was named one of Technology Review's Top 35 Under 35 in 2005, and is the recipient of several awards, including an NSF CAREER Award in 2004, a Sloan Foundation Fellowship in 2007, best paper awards in VLDB 2004 and 2007, and a best paper award in MobiCom 2006.

December 8th

Ragunathan Rajkumar
Carnegie Mellon University
"Driverless Vehicles: What Can We (Not) Do?"


Abstract:
Self-driving vehicles are constantly in the news today, with the recent flood of activity having its origins in the DARPA Grand Challenges. Self-driving vehicles, we hear constantly, will revolutionize transportation. In this talk, we will address some basic questions and challenges that must be resolved for the revolution to materialize. What are the technology barriers that must be surmounted before driverless vehicles can take over our public roads and highways? How good is sensing of the transportation landscape? Can connectivity play a role? What non-technical aspects stand in the way? The talk will be based on real-world experiences, spiced with some speculation.

Bio:

Prof. Raj Rajkumar is the George Westinghouse Professor of Electrical & Computer Engineering and the Robotics Institute at Carnegie Mellon University. At Carnegie Mellon, he directs the National University Transportation Center for Safety, which is sponsored by the US Department of Transportation. He also directs the Real-Time and Multimedia Systems Laboratory (RTML) and co-directs the General Motors-Carnegie Mellon Connected and Autonomous Driving Collaborative Research Laboratory (CAD-CRL). Raj has served as Program Chair and General Chair of six international ACM/IEEE conferences on real-time systems, wireless sensor networks, cyber-physical systems, and multimedia computing/networking. He has authored one book, edited another, holds three US patents, and has more than 160 publications in peer-reviewed forums, eight of which have received Best Paper Awards. He has given keynotes and distinguished lectures at several international conferences and universities. He is an IEEE Fellow, an ACM Distinguished Engineer, and a co-recipient of the IEEE Simon Ramo Medal. He has received an Outstanding Technical Achievement and Leadership Award from the IEEE Technical Committee on Real-Time Systems. Prof. Rajkumar’s work has influenced many commercial operating systems. He was also the primary founder of Ottomatika Inc., a company that focused on delivering the core software intelligence for self-driving vehicles; Ottomatika was recently acquired by Delphi. His research interests include all aspects of cyber-physical systems.


December 13th

Faculty Candidate Talk
Chenfanfu Jiang
UCLA
"Hybrid methods for computer graphics simulation of snow, sand, water, foam and beyond"


Abstract:

Simulation of natural phenomena for virtual worlds and characters is an important aspect of computer graphics that remains extremely challenging. By modeling the complex motion of phenomena such as breaking ocean waves, crumbling sand castles and falling snow, computer graphics researchers can create realistic virtual environments for a wide range of applications. The most challenging natural phenomena are those whose dynamics involve dramatic topological changes and therefore require sophisticated numerical approaches to achieve sufficient accuracy and visual realism. The need for computational efficiency, topological variability, and numerical stability has led my research toward hybrid, Lagrangian/Eulerian methods, particularly the Particle-In-Cell variants. In this talk, I will focus on how important and effective it is to correctly design and model the math and physics behind natural phenomena. Particularly, I will discuss the derivation and application of the Affine Particle-In-Cell Method (APIC) -- the magic behind water simulation in Disney's Moana. I will also show the power of modeling the correct physics of granular materials with the Material Point Method (MPM) -- the workhorse behind Disney’s Frozen. Furthermore, I will demonstrate that simulations done rigorously following scientific principles not only produce astonishing visual effects, but can also be applied to many other fields including mechanical engineering, computer vision, medicine and cognitive science.

Bio:

Chenfanfu Jiang received his Ph.D. in Computer Science at UCLA in 2015. He was awarded the UCLA Engineering School Edward K. Rice Outstanding Doctoral Student for the top PhD in the school of engineering. He is currently a postdoctoral researcher at UCLA, jointly appointed to the departments of Mathematics and Computer Science. His primary research interests include solid/fluid mechanics, scientific computing, numerical methods, physics based simulation and biomechanical modeling. He also works with Walt Disney Animation Studios, Dreamworks, and Center for Advanced Surgical and Interventional Technology (CASIT), applying scientific computing techniques to simulate the dynamics of virtual materials like snow, sand, water, and human tissue.

