Speaker: Wojciech Szpankowski
Time: 2:00 pm - 3:00 pm
Location: LINCS Meeting Room 40
Abstract: Information is the distinguishing mark of our era, permeating every facet of our lives. The ability to understand and harness information has the potential to enable significant advances. Our current understanding of information dates back to Claude Shannon's revolutionary work of 1948, which resulted in a general mathematical theory of reliable communication that not only formalized the principles of modern digital communication and storage but also paved the way for the Internet, DVDs and iPods of today. While Shannon's information theory has had a profound impact, its application beyond storage and communication poses foundational challenges. In 2010 the National Science Foundation established the Science & Technology Center for the Science of Information to meet the new challenges posed by rapid advances in networking, biology and knowledge extraction. Its mission is to advance science and technology through a new quantitative understanding of the representation, communication and processing of information in biological, physical, social and engineering systems. Purdue University leads nine partner institutions: Berkeley, Bryn Mawr, Howard, MIT, Princeton, Stanford, Texas A&M, UCSD, and UIUC (cf. http://cacm.acm.org/magazines/2011/2/104389-information-theory-after-shannon/fulltext).
In this talk, after briefly reviewing Shannon's main results, we attempt to identify some features of information encompassing its structural, spatio-temporal, and semantic facets. We present two new results: first, a fundamental lower bound for structural compression, together with a novel algorithm that achieves this bound for graphical structures; second, a solution to the problem of deinterleaving Markov processes over disjoint finite alphabets that have been randomly interleaved by a finite-memory switch.
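As background for the review of Shannon's results mentioned above: the central quantity of his 1948 theory, the entropy H(X) = -Σ p(x) log₂ p(x), gives the fundamental limit on lossless compression in bits per symbol. A minimal sketch (the function name and example distribution are illustrative, not from the talk):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(X) = -sum(p * log2(p)) in bits per symbol.

    Terms with p == 0 contribute nothing, by the convention 0 * log 0 = 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A uniform source over four symbols cannot be compressed below 2 bits/symbol.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # → 2.0

# A biased coin needs strictly less than 1 bit per flip on average.
print(shannon_entropy([0.9, 0.1]))
```

Structural compression, one of the two results announced here, asks the analogous question for graphs rather than sequences: how few bits suffice to describe a graph's structure when vertex labels carry no information.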
Biography: Wojciech Szpankowski is the Saul Rosen Professor of Computer Science and (by courtesy) Electrical and Computer Engineering at Purdue University, where he teaches and conducts research in analysis of algorithms, information theory, bioinformatics, analytic combinatorics, random structures, and stability problems of distributed systems. He received his M.S. and Ph.D. degrees in Electrical and Computer Engineering from Gdansk University of Technology. He has held several Visiting Professor/Scholar positions, including at McGill University; INRIA, France; Stanford; Hewlett-Packard Labs; Universite de Versailles; the University of Canterbury, New Zealand; Ecole Polytechnique, France; and the Newton Institute, Cambridge, UK. He is a Fellow of the IEEE and an Erskine Fellow. In 2010 he received the Humboldt Research Award. In 2001 he published the book Average Case Analysis of Algorithms on Sequences (John Wiley & Sons). He has been a guest editor and an editor of technical journals, including Theoretical Computer Science, the ACM Transactions on Algorithms, the IEEE Transactions on Information Theory, Foundations and Trends in Communications and Information Theory, Combinatorics, Probability and Computing, and Algorithmica. In 2008 he launched the interdisciplinary Institute for Science of Information, and in 2010 he became the Director of the newly established NSF Science and Technology Center for Science of Information.