Algorithmic game theory is a relatively new field connecting computer science
and economics. A main driving force behind this field was the Internet. Suddenly, there
was a need to efficiently handle large scale complex economic markets (ad auctions
and spectrum auctions are well known examples). Algorithmic game theory addresses
this need by combining the vast experience of economics in designing markets with the
expertise of computer scientists in efficiently handling large computerized systems. The
scope of the research has broadened to include diverse topics such as algorithmic
mechanism design, price of anarchy, social choice, social networks and equilibrium
computation. Tools from algorithmic game theory have been applied to the modeling and
analysis of various social phenomena (opinion formation is one example).
Algorithms is a core area of computer science, concerned with the design, analysis and
implementation of problem-solving methods. The foundations of any computer system,
whether it is a home desktop, a datacenter, the Internet, or a deep neural network,
rely on sound algorithmic ideas. The quantity of digital data continues to grow at an
exponential rate, which accentuates the need for faster and more accurate algorithms.
The goal of our research group is to provide scalable solutions to a wide array of
problems in various settings. We also study which algorithmic objectives are
impossible, by proving lower bounds.
Automata play a major role in the theory of computation, compiler construction, artificial intelligence, natural language processing, and formal verification. This field studies the theory of automata in its many dimensions, as well as its relations to logic, games, and online algorithms.
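As a small illustration of the objects this field studies, here is a sketch of a deterministic finite automaton simulated in a few lines of Python; the automaton, its state names, and the example language (binary strings with an even number of 1s) are illustrative choices, not taken from any specific project:

```python
# A minimal DFA: accepts binary strings containing an even number of 1s.
# States, alphabet, and transition function are encoded as plain Python data.
EVEN_ONES_DFA = {
    "start": "even",
    "accept": {"even"},
    "delta": {
        ("even", "0"): "even",
        ("even", "1"): "odd",
        ("odd", "0"): "odd",
        ("odd", "1"): "even",
    },
}

def dfa_accepts(dfa, word):
    """Run the DFA on `word` and report whether it halts in an accepting state."""
    state = dfa["start"]
    for symbol in word:
        state = dfa["delta"][(state, symbol)]
    return state in dfa["accept"]
```

For example, `dfa_accepts(EVEN_ONES_DFA, "1011")` follows the transition table one symbol at a time; the same table-driven structure underlies applications such as lexical analysis in compilers.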
Coding theory deals with the design of error-correcting codes and the study of their
properties. Beyond error correction, such codes have many other applications, such as
data compression, cryptography, and networking lower bounds. Codes are studied in
various scientific disciplines, such as information theory, electrical engineering,
mathematics, linguistics, and computer science.
Our research is focused on designing codes for interactive communication and on locally
decodable codes.
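To make the idea of error correction concrete, here is a sketch of the simplest possible code, the 3-fold repetition code; it is a toy example for illustration only, far weaker than the interactive and locally decodable codes studied in practice:

```python
# Toy error-correcting code: the 3-fold repetition code. Each bit is transmitted
# three times; the decoder takes a majority vote per 3-bit block, which corrects
# any single bit flip within a block.

def encode(bits):
    """Repeat every bit three times."""
    return [b for b in bits for _ in range(3)]

def decode(received):
    """Majority-vote each consecutive block of three received bits."""
    out = []
    for i in range(0, len(received), 3):
        block = received[i:i + 3]
        out.append(1 if sum(block) >= 2 else 0)
    return out
```

For example, flipping any single bit of `encode([1, 0, 1])` still decodes back to `[1, 0, 1]`. The price is a threefold blow-up in length; much of coding theory is about achieving robustness at far better rates.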
Complexity theory is an active research area that analyzes inherent hardness or difficulty of various computational problems.
It strives for a full classification of which problems are difficult and to what extent, in terms of running time, space, and other computational resources. The area is full of beautiful results, among which are some of the most foundational ones in computer science,
including the P versus NP question and the PCP characterization. In addition, complexity theory is closely related to cryptography. As a result, many results in complexity theory have direct and immense practical bearing.
Modern cryptography provides algorithms and protocols for protecting honest parties
from distrusted or malicious parties that attempt to eavesdrop on communication or
modify it. It plays an increasingly dominant role in modern life due to the deployment of
new technologies that expose users to new potential threats.
Cryptography is a very broad field of research that combines both theory and practice.
Our research spans diverse sub-areas of cryptography such as secure multiparty
computation (which aims to develop protocols that allow parties to jointly compute a
function of their inputs while making sure the inputs remain private) and cryptanalysis
(whose goal is to devise methods to evaluate the security of ciphers, thus ensuring that
only the most secure ones are deployed).
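A standard building block behind secure multiparty computation is secret sharing; the following is a minimal sketch of additive secret sharing over a prime field (the modulus and function names are illustrative choices, not a production scheme):

```python
import random

P = 2**61 - 1  # a Mersenne prime; the field size here is an illustrative choice

def share(secret, n):
    """Split `secret` into n additive shares mod P.

    Any n-1 shares are uniformly random and reveal nothing about the secret;
    only all n shares together reconstruct it.
    """
    shares = [random.randrange(P) for _ in range(n - 1)]
    shares.append((secret - sum(shares)) % P)
    return shares

def reconstruct(shares):
    return sum(shares) % P

def add_shares(xs, ys):
    """Additive homomorphism: parties add their shares locally to obtain
    shares of x + y, without ever learning x or y."""
    return [(x + y) % P for x, y in zip(xs, ys)]
```

The homomorphism in `add_shares` hints at how parties can jointly compute on private inputs: local operations on shares correspond to operations on the hidden values.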
Look at your smartphone. Do you think that the company that manufactured it is collecting information about you? Do you care?
Data-driven systems are guiding decisions in many areas of society, and the way in which this happens is increasingly complex. As a result, we constantly confront the need to balance the benefits of these systems against the negative consequences that can accompany them, privacy issues in particular. In the field of data privacy we study ways for organizations to collect and analyze statistical data on individuals without compromising their privacy.
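One prominent approach studied in this field is differential privacy. The following sketch shows its classic Laplace mechanism applied to a counting query; the dataset, predicate, and epsilon value are illustrative choices:

```python
import random

def laplace_noise(scale):
    """Sample Laplace(0, scale) as the difference of two exponential variates."""
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def private_count(records, predicate, epsilon):
    """Release a count satisfying epsilon-differential privacy.

    A counting query has sensitivity 1 (adding or removing one person changes
    the count by at most 1), so Laplace noise with scale 1/epsilon suffices.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)
```

Smaller epsilon means more noise and stronger privacy; the analyst trades accuracy of the released statistic for a formal guarantee about any individual's data.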
In distributed algorithms we study problems modeled by a graph, whose vertices host
processors. These processors communicate with one another along communication links,
modeled by edges of the graph. Initially, each vertex knows only its own part of the graph, and
ultimately the vertices must jointly solve some common task, such as computing a minimum spanning tree
or coloring the graph with few colors. The objective is to perform this efficiently, both in terms of running time and in terms of total communication. This is a very active area, with a rich mathematical theory, numerous open problems, and practical applications to the world of communication networks.
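The synchronous message-passing model described above can be sketched with a round-by-round simulation of one of the simplest distributed algorithms, BFS flooding from a source vertex; the graph and variable names below are illustrative:

```python
# Synchronous message-passing sketch: BFS flooding. In each round, every vertex
# that has just learned its distance from the source informs its neighbors.
# After at most diameter + 1 rounds, every vertex knows its distance.

def bfs_flood(adjacency, source):
    dist = {v: None for v in adjacency}
    dist[source] = 0
    frontier = {source}  # vertices that send messages in the current round
    rounds = 0
    while frontier:
        rounds += 1
        next_frontier = set()
        for v in frontier:            # each frontier vertex messages its neighbors
            for u in adjacency[v]:
                if dist[u] is None:   # u learns its distance for the first time
                    dist[u] = dist[v] + 1
                    next_frontier.add(u)
        frontier = next_frontier
    return dist, rounds
```

The number of rounds here grows with the graph's diameter, which illustrates why running time in this model is measured in communication rounds rather than local computation steps.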
The main objective in the field of metric embedding is to understand how well
arbitrary “complex” metric spaces can be represented in “simpler” spaces. Typical simple
spaces can be a low-dimensional space, Euclidean space, a tree, a sparse graph, etc.
Low-distortion embeddings provide a powerful and versatile toolkit for solving
algorithmic problems, applicable in a variety of settings, such as: approximation and
online algorithms, computational biology, machine learning, computer vision and many
others.
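The usual way to quantify “how well” is the distortion of an embedding; a sketch of the standard definition:

```latex
% Distortion of an embedding f : (X, d_X) \to (Y, d_Y).
% f has distortion at most D \ge 1 if there is a scaling factor c > 0 such that
\[
  c \, d_X(x, y) \;\le\; d_Y\bigl(f(x), f(y)\bigr) \;\le\; c \, D \, d_X(x, y)
  \qquad \text{for all } x, y \in X.
\]
```

For example, a classical theorem of Bourgain shows that every $n$-point metric space embeds into Euclidean space with distortion $O(\log n)$, which is what makes such embeddings broadly applicable as an algorithmic tool.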
Quantum computers are computing devices that use quantum mechanics to their advantage. Quantum computers (if they are ever built at scale) could solve certain computational problems much faster than classical computers. Fundamentally, the questions we are trying to answer are: which computational tasks are quantum computers good for, and which tasks are they not good for?
The main aim in the field of theorem proving is to develop powerful mechanical reasoning systems (known as theorem provers) that enable mathematicians to prove theorems with computer programs, and computer scientists to verify that hardware and software designs meet their formal specifications. Nowadays, theorem provers play a vital role in modeling and reasoning about complex and large-scale systems, especially safety-critical systems.
Technically, theorem provers employ mathematical formalisms and automated-reasoning approaches to perform inferences and generate proofs. The field consists of two main approaches, distinguished by the level of human interaction in the proof search process: automated theorem proving, which includes methods such as SMT solving and model checking, and interactive theorem proving, carried out with tools often called proof assistants.
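As a small taste of the interactive style, here is a toy machine-checked proof in Lean 4 (the theorem chosen is illustrative; real verification efforts target far larger statements):

```lean
-- A tiny interactive proof: 0 + n = n for natural numbers, by induction.
-- Each step is checked by the proof assistant's kernel.
theorem zero_add' (n : Nat) : 0 + n = n := by
  induction n with
  | zero => rfl
  | succ k ih => rw [Nat.add_succ, ih]
```

The point is not the theorem itself but the workflow: the user supplies proof steps (here, induction and rewriting), and the system verifies each one, so a completed proof carries a machine-checked guarantee.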