CSCE 791: Seminar on Advances in Computing

Fridays 2:30pm to 3:20pm (300 S. Main, B102)

Expectations for the Course

I expect a brief write-up from each of you after each talk. Summarize the content of the talk and describe briefly what you got out of it. Also include any questions you would have liked to ask the speaker, even the ones you only thought of afterward. This is a brief write-up. Don't go over a page. A paragraph or two should suffice.

Upload your write-up to the CSE Dropbox for the appropriate week.


Date Speaker Topic
15 Jan Stephen Fenner
Computer Science and Engineering
University of South Carolina
Title: A few remarks about the course

Abstract: I will briefly describe the course and its expectations.

(Self) Reference:

Bio: Stephen Fenner is Associate Professor of Computer Science and Engineering at the University of South Carolina. He earned a B.A. (you saw that right – a B.A.) in Physics from Harvard-Radcliffe College in 1983, and an M.S. (1988) and Ph.D. (1991) in Computer Science from the University of Chicago. His research interests are mostly in theoretical computer science, including computational complexity theory and the theory of quantum information processing. He is also interested in cryptography, security, and various other connections between computers and mathematics.

22 Jan Stephen Fenner
Computer Science and Engineering
University of South Carolina
Title: Entanglement, Teleportation, and Quantum Channels

Abstract: About a year ago, Matthew Hastings discovered superadditive quantum channels, resolving a long-standing open question in quantum information theory. Such channels can transmit more information when used in tandem than the combined amount of information that each channel can send individually. The key to superadditivity is that the inputs to the channels are quantum states which are entangled, i.e., correlated in a uniquely quantum way. Quantum entanglement is a valuable resource for quantum information processing. I will introduce the notion of quantum entanglement and give some examples of its use in quantum information and computation, in particular quantum teleportation. If time permits, I will also discuss how quantum channels can be superadditive.

This is an extended reprise of a talk given last Fall in the same seminar series.
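
For readers who have not seen entanglement before, the short sketch below (in Python/NumPy, purely illustrative and not taken from the talk) builds the two-qubit Bell state (|00> + |11>)/√2 and computes the reduced state of the first qubit. The reduced state comes out maximally mixed, which for a pure two-qubit state is a hallmark of entanglement.

    import numpy as np

    # Single-qubit basis states |0> and |1>
    ket0 = np.array([1.0, 0.0])
    ket1 = np.array([0.0, 1.0])

    # Bell state |Phi+> = (|00> + |11>) / sqrt(2)
    bell = (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2)

    # Density matrix of the full two-qubit (pure) state
    rho = np.outer(bell, bell.conj())

    # Partial trace over the second qubit: reshape to (2, 2, 2, 2) and trace out axes 1 and 3
    rho_A = np.trace(rho.reshape(2, 2, 2, 2), axis1=1, axis2=3)

    print(rho_A)  # [[0.5, 0.], [0., 0.5]]: maximally mixed, so the two qubits are entangled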

29 Jan Song Wang
Computer Science and Engineering
University of South Carolina
Title: Shape Matching and Classification: Algorithms and Performance Evaluation

Abstract: In this talk, I will go over some recent advances in 2D shape matching and classification, which have important applications in computer vision, image processing, and pattern recognition. With these advances, made by different research groups, we can more robustly quantify shape similarity under noise, occlusion, and nonrigid shape deformation. I will then introduce two new perceptually motivated strategies we recently developed for further improving shape matching and classification. I will also briefly introduce the MPEG7 shape benchmark and the associated bull's-eye test, which are widely used for evaluating and comparing the performance of different shape matching and classification algorithms.
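
As a point of reference for what "quantifying shape similarity" can mean at its simplest, the Python/NumPy sketch below computes the symmetric Hausdorff distance between two 2D point sets. This baseline is deliberately naive and is not one of the methods discussed in the talk; it is exactly the kind of measure that breaks down under the noise, occlusion, and deformation that the newer algorithms are designed to handle.

    import numpy as np

    def hausdorff(A, B):
        # Symmetric Hausdorff distance between two point sets (rows are 2D points)
        d = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=2)  # pairwise distances
        return max(d.min(axis=1).max(), d.min(axis=0).max())

    # Toy example: a unit square versus a slightly perturbed copy
    square = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], dtype=float)
    wobbly = square + np.array([[0.05, 0.0], [0.0, -0.05], [0.02, 0.03], [0.0, 0.0]])
    print(hausdorff(square, wobbly))  # small value for similar shapes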

5 Feb John Rose
Computer Science and Engineering
University of South Carolina
Title: De Novo Peptide Sequencing: Informatics and Pattern Recognition Applied to Proteomics

Abstract: Tandem mass spectrometry (MS/MS) is the engine driving high-throughput protein identification. The samples from which the data are derived may contain complex mixtures of thousands of proteins. Moreover, in environmental samples the proteins may derive from multiple species. These protein mixtures (with or without prior separation) are treated with proteolytic enzymes, cutting the proteins into smaller peptides of a size manageable by current MS/MS technology. The peptides are then analyzed, generating MS/MS spectra. The task of determining the identity of a peptide from its spectrum is currently the weak point in the process. Current approaches to de novo sequencing are able to compute candidate peptides efficiently; the problem lies in the limitations of current scoring functions. In this presentation we introduce the concept of a proteome signature. By examining proteins and compiling proteome signatures (amino acid usage), it is possible to characterize likely combinations of amino acids and better distinguish between candidate peptides. Our results strongly support the hypothesis that a scoring function that considers amino acid usage patterns is better able to distinguish between candidate peptides, which in turn leads to higher accuracy in peptide prediction.
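
To make the idea of a proteome signature concrete, the Python sketch below compiles amino acid usage frequencies from a toy set of protein sequences and uses them as a simple log-likelihood term for ranking candidate peptides. This is only an illustration of the general idea; the sequences, the scoring function, and the numbers are hypothetical and are not the scoring function used in Dr. Rose's work.

    from collections import Counter
    import math

    def aa_usage(proteins):
        # Amino acid usage frequencies (a crude "proteome signature")
        counts = Counter("".join(proteins))
        total = sum(counts.values())
        return {aa: c / total for aa, c in counts.items()}

    def usage_score(peptide, usage, floor=1e-6):
        # Log-likelihood of a candidate peptide under the usage frequencies
        return sum(math.log(usage.get(aa, floor)) for aa in peptide)

    # Toy "proteome" and two hypothetical candidate peptides
    proteome = ["MKTAYIAKQR", "GAVLIMKKTA", "AAKRGGML"]
    usage = aa_usage(proteome)
    for candidate in ["AKQR", "WCHW"]:
        print(candidate, round(usage_score(candidate, usage), 2))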

12 Feb Derek A. Edwards
ManTech International Corporation
Title: What doesn't work in information security

Bio: Derek Edwards has been a Computer Forensics Intrusion Analyst with ManTech International for the past eight years, where he has served as an incident handler at the U.S. Departments of Justice and State. At State, he contributed to the team that won the 2004 Frank B. Rowlett Trophy for Organizational Achievement, presented by the National Security Agency.

After work, Derek enjoys sailing, SCUBA diving, and just about anything else that involves sun, sea, and sand in a warm climate. Derek has been a computer programmer since age 12 and continues to enjoy pursuits in computing, including publishing his church's website and video podcast.

19 Feb José M. Vidal
Computer Science and Engineering
University of South Carolina
Title: Multiagent Systems

Abstract: In this informal talk I will go over some of my current research projects in collaborative knowledge systems, supply chains, and agent-based traffic models.

26 Feb Andrew M. Odlyzko
School of Mathematics
and
The Digital Technology Center
University of Minnesota
Title: How to live and prosper with insecure cyberinfrastructure

Location: Amoco Hall, Swearingen Building

Abstract: Mathematics has contributed immensely to the development of secure cryptosystems and protocols. Yet our networks are terribly insecure, and we are constantly threatened with the prospect of imminent doom. Furthermore, even though such warnings have been common for the last two decades, the situation has not gotten any better. On the other hand, there have not been any great disasters either. To understand this paradox, we need to consider not just the technology, but also the economics, sociology, and psychology of security. Any technology that requires care from millions of people, most very unsophisticated in technical issues, will be limited in its effectiveness by what those people are willing and able to do. This imposes strong limits on what formal mathematical methods can accomplish, and suggests that we will have to put up with the equivalent of baling wire and chewing gum, and to live on the edge of intolerable frustration.

Please note: This lecture is the second of two given by Dr. Odlyzko for the USC Phi Beta Kappa Visiting Scholar Program. The first, titled "Technology manias: From railroads to the Internet and beyond," will be given on Thursday, February 25, at 3:30pm in Amoco Hall.

Bio: Andrew Odlyzko is a Professor in the School of Mathematics at the University of Minnesota. He is engaged in a variety of projects, from mathematics to security and Internet traffic monitoring. His main task currently is to write a book that compares the Internet bubble to the British Railway Mania of the 1840s and explores the implications for the future of technology diffusion.

Between 2001 and 2008, he also was at various times the founding director of the interdisciplinary Digital Technology Center, Interim Director of the Minnesota Supercomputing Institute, Assistant Vice President for Research, and held an ADC Professorship, all at the University of Minnesota. Before moving to Minneapolis in 2001, he devoted 26 years to research and research management at Bell Telephone Laboratories, AT&T Bell Labs, and AT&T Labs, as that organization evolved and changed its name.

He has written over 150 technical papers in computational complexity, cryptography, number theory, combinatorics, coding theory, analysis, probability theory, and related fields, and has three patents. He has an honorary doctorate from Univ. Marne la Vallee and serves on editorial boards of over 20 technical journals, as well as on several advisory and supervisory bodies.

He has managed projects in diverse areas, such as security, formal verification methods, parallel and distributed computation, and auction technology. In recent years he has also been working on electronic publishing, electronic commerce, and economics of data networks, and is the author of such widely cited papers as "Tragic loss or good riddance: The impending demise of traditional scholarly journals," "The bumpy road of electronic commerce," "Paris Metro Pricing for the Internet," "Content is not king," and "The history of communications and its implications for the Internet." He may be known best for an early debunking of the myth of Internet traffic doubling every three or four months and for demonstrating that connectivity has traditionally mattered much more for society than content.

5 Mar Jianjun Hu
Computer Science and Engineering
University of South Carolina
Title: Application of Machine Learning and Data Mining in Bioinformatics of Protein Sorting Signals and Drug Design

Abstract: Machine learning and data mining techniques are widely applied across scientific and engineering research fields. In this talk, I will present our research in applying these techniques, including classification, global optimization, and motif discovery, to protein sorting signal identification, protein localization prediction, computational drug design, and engineering design. These cases should give the audience a good starting point for formulating their own data mining and machine learning problems and trying to solve them, either independently or in collaboration with us. This talk also aims to highlight the importance of these techniques, which you can learn by enrolling in our Data Mining (CSCE 822) or Machine Learning (CSCE 883) courses. The talk is part of the outreach effort for our research funded by an NSF CAREER award. More information can be accessed at our lab homepage, http://mleg.cse.sc.edu.

12 Mar (Spring Break) (No seminar)

19 Mar Srihari Nelakuditi
Computer Science and Engineering Department
University of South Carolina
Title: Breaking Away from Collision Avoidance: Towards Collision Detection in Wireless Networks

Abstract: Wireless networks are founded on the principles of collision avoidance. This talk presents an attempt to detect and abort collisions in wireless networks. Briefly, the receiver uses physical layer information to detect a collision, and immediately notifies the transmitter to abort the transmission. The collision notification consists of a unique signature, sent on the same frequency channel as the data. The transmitter uses a second listener antenna to discern this notification through signature correlation. The transmitter aborts, freeing the channel for other productive transmissions. A prototype testbed of 10 USRP/GNURadios demonstrates the feasibility and effectiveness of our approach.
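
The detection step at the transmitter's listener antenna boils down to correlating incoming samples against a known signature. The Python/NumPy sketch below shows that idea in its simplest form; the signature, noise model, and threshold are hypothetical stand-ins, not the actual USRP/GNURadio implementation.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical collision-notification signature: a fixed pseudo-random +/-1 sequence
    signature = rng.choice([-1.0, 1.0], size=64)

    # Samples seen by the listener antenna: noise with the signature embedded at offset 400
    received = rng.normal(0.0, 1.0, size=1000)
    received[400:464] += signature

    # Slide the signature over the samples and compare the normalized correlation to a threshold
    corr = np.correlate(received, signature, mode="valid") / len(signature)
    threshold = 0.5  # in practice this would be tuned from measured noise statistics
    print("notification detected near sample(s):", np.flatnonzero(corr > threshold))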

Bio: Srihari Nelakuditi received his Ph.D. in Computer Science from the University of Minnesota, Minneapolis. He is currently an Associate Professor at the University of South Carolina, Columbia, and received an NSF CAREER Award in 2005. His current research focus is on making wired and wireless networks more resilient to failures and disruptions and more efficient in utilizing network resources.

26 Mar John Rogers
College of Computing and Digital Media
DePaul University
Title: Constructive Logic

Abstract: Mathematical logic is one of the foundations of computer science, and a thorough understanding of that foundation is necessary for carrying out research in many areas. Most in CS are familiar with classical propositional logic, where the truth value of a formula is determined by the truth values assigned to the atomic propositions. But there are other logics, one could argue, that more closely follow the approach CS uses to tackle problems. One such logic is constructive logic, also known as intuitionistic logic. In classical logic, showing that a formula is valid is done by showing that it is true under all truth assignments. In constructive logic, showing that a formula is valid means providing a proof.

For example, classically we say that either P = NP or P ≠ NP. But because we don't have a proof of either assertion, constructive logic maintains that the disjunction is not valid. In fact, the schema α ∨ ¬α is not constructively valid in general. This notion of requiring a proof, and of building proofs of compound formulas from proofs of their parts, is defined for each of the connectives. The idea can be carried over to the typed lambda calculus and from there to strongly typed functional languages like ML and Haskell.
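
The proofs-as-programs connection mentioned above can be made concrete even in Python's type annotations (a loose, purely illustrative stand-in for the typed lambda calculus): a total, uniformly typed function is a constructive proof of the formula its type spells out.

    from typing import Callable, Tuple, TypeVar

    A = TypeVar("A")
    B = TypeVar("B")

    # A program of type A -> (B -> A) is a constructive proof of A → (B → A).
    def weakening(a: A) -> Callable[[B], A]:
        return lambda _b: a

    # A program of type (A and B) -> (B and A) proves that conjunction commutes.
    def and_commutes(pair: Tuple[A, B]) -> Tuple[B, A]:
        a, b = pair
        return (b, a)

    # By contrast, no total program has the type "A or (A -> False)" uniformly in A,
    # mirroring the failure of α ∨ ¬α as a constructively valid schema.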

In this talk, I will present the fundamentals of constructive propositional logic, discuss its application to CS, and end with an open problem.

Bio: John Rogers is currently an Associate Professor of Computer Science in the College of Computing and Digital Media at DePaul University, where he has been since earning his Ph.D. in 1995 at the University of Chicago. His main area of interest is computational complexity theory and, from that perspective, he has investigated aspects of quantum computing, applications of Kolmogorov complexity, poset game theory, and various non-classical logics. He is the publicity chair for the annual Conference on Computational Complexity and chief organizer of the semi-annual Midwest Theory Day.

2 Apr Jason Bakos
Computer Science and Engineering Department
University of South Carolina
Title: Heterogeneous Computing: New Directions for Efficient and Scalable High Performance Computing

Abstract: Until recently, Moore's Law, coupled with advancements in computer architecture, has allowed the performance achieved with legacy software to continue to improve. However, current-generation microprocessors can no longer sustain growth in their per-thread performance, and in response there has been a fundamental shift towards scaling up the number of individual processor cores per CPU. Yet each of these processor cores is still designed with the premise of extracting as much performance as possible from code that is oblivious to the underlying microarchitecture, requiring a large portion of its real estate to be allocated to control logic and cache rather than computational units.

Heterogeneous computing, which is the technique of combining special-purpose processors with general-purpose processors within the same computer system, offers the potential to boost performance hundreds or thousands of times more efficiently than simply scaling up the number of general-purpose processors. However, making this technique accessible to application programmers will require a new generation of specialized development tools and design methodologies that don't yet exist. In this talk, I will present an overview of our work in heterogeneous computing, as well as provide a peek into what the future may hold for this field.

9 Apr Chin-Tser Huang
Computer Science and Engineering Department
University of South Carolina
Title: Authentication and Privacy in WiMAX Networks

Abstract: Previous wireless networking technologies operated either over a limited range, such as the popular WiFi, or at a limited speed, such as cellular networks. The IEEE 802.16 standard, more commonly known as WiMAX, is poised to overcome these restrictions and bring broadband access to many more households with its ability to transmit over a range of up to 30 miles at speeds as high as 75 Mbps. This represents a highly efficient solution, as it allows an entire geographical area, such as a metropolitan city or rural town, to be effectively covered by a few base stations without individually laying high-speed cable to each building. Moreover, WiMAX supports two other significant capabilities, mobility and multicasting, which make it even more powerful and convenient to use. However, as the interest in and adoption of WiMAX rapidly increase, its security also becomes a growing concern, especially on the issues of authentication and privacy.

In this talk, we will consider authentication and privacy in WiMAX networks in terms of three major components. First, we address the need for sufficient and efficient authentication and privacy in WiMAX networks by presenting several attack scenarios and discussing major schemes that provide authentication and privacy in wired networks. Second, we overview the security sublayer in the IEEE 802.16 standard, in particular its Privacy and Key Management protocols (PKM, and PKMv2 in the latest IEEE 802.16e), and reveal their vulnerabilities in the face of several types of attacks. Third, we propose solutions that fix the aforementioned vulnerabilities of PKM and PKMv2, and present novel protocols aimed at providing authentication and privacy for multicasting and roaming in WiMAX networks. Evaluation results show that these new protocols perform more efficiently than the current standard.

16 Apr Duncan Buell
Computer Science and Engineering Department
University of South Carolina
Title: Topics in Digital Humanities

Abstract: I will be on sabbatical in the fall semester of 2010, spending time on the other end of campus with the Center for Digital Humanities. In this (very) informal talk I will go over some of the problems that are currently being proposed in the digital humanities as well as some that might be proposable. The humanities are, to a great extent, just now entering the digital world, in some contrast to the science and technology disciplines. For this reason, many opportunities exist for significant changes in scholarship in the humanities.

23 Apr Marcus Schaefer
College of Computing and Digital Media
DePaul University
Title: The Real Logic of Drawing Graphs

Abstract: We study the complexity of several geometric graph drawing problems, including the recognition problem for intersection graphs of convex sets and the realizability of linkages. It turns out that the complexity of these problems is intricately tied to the existential theory of the real numbers. We compare this to known results in the literature, including results on the rectilinear crossing number, the Steinitz problem, and Nash equilibria, and argue that there is a need to introduce and study new classes to capture this type of complexity.
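
For readers unfamiliar with the existential theory of the reals: it consists of the true sentences of the form ∃x₁ … ∃xₙ φ(x₁, …, xₙ), where φ is a quantifier-free Boolean combination of polynomial equations and inequalities with integer coefficients. As a small illustration (not taken from the abstract), the sentence

    ∃x ∃y (x² + y² = 1 ∧ x·y > 1/4)

is true, witnessed by x = y = 1/√2. Deciding such sentences is NP-hard and lies in PSPACE, and problems polynomial-time equivalent to this decision problem form the kind of new complexity class the abstract alludes to.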