Abstracts 2014

Local Computation Algorithms and Local Mechanism Design

by Shai Vardi (TAU)

The talk is divided into two parts. In the first part I will give an introduction to Local Computation Algorithms (LCAs). LCAs implement query access to a global solution of a computational problem, using polylogarithmic time and space. I will also discuss how to construct LCAs using a reduction from online algorithms.
In the second part I will explore Local Mechanism Design - designing truthful mechanisms that run in polylogarithmic time and space. I will focus on local scheduling algorithms.
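To make the query-access model from the first part concrete, here is a minimal sketch of a textbook-style LCA for maximal independent set under a random vertex ordering (an illustration only, not necessarily one of the constructions covered in the talk; the graph, the lazily sampled ranks, and the memoization strategy are simplifying assumptions):

```python
import random

class GreedyMISLocal:
    """Query access to the maximal independent set produced by greedily scanning
    vertices in a random order, without ever materializing the whole solution."""

    def __init__(self, adjacency, seed=0):
        self.adjacency = adjacency      # vertex -> list of neighbors (query access to the graph)
        self.rng = random.Random(seed)
        self.rank = {}                  # lazily sampled random priorities
        self.memo = {}                  # answers already computed, for consistency and speed

    def _rank(self, v):
        if v not in self.rank:
            self.rank[v] = self.rng.random()
        return self.rank[v]

    def in_mis(self, v):
        """v is in the greedy MIS iff no neighbor of strictly smaller rank is in it."""
        if v not in self.memo:
            self.memo[v] = all(
                not self.in_mis(u)
                for u in self.adjacency[v]
                if self._rank(u) < self._rank(v)
            )
        return self.memo[v]

# Answer a single query; on bounded-degree graphs only a small neighborhood is explored.
graph = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
print(GreedyMISLocal(graph).in_mis(2))
```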

The talk is based on joint works with Noga Alon, Yishay Mansour, Ronitt Rubinfeld, Aviad Rubinstein and Ning Xie.


Worst-Case Coalitions in Routing Games

by Gideon Blocq (Technion)

Recently, there has been a growing interest in network scenarios where selfish agents are able to communicate, bargain and form coalitions, i.e., play a cooperative game, in order to overcome the inefficiency of noncooperative equilibria. We study such coalitional games under worst-case conditions in the fundamental load-balancing setting of routing over parallel links. Specifically, we define the cost of a coalition as the outcome of a Stackelberg game in which a rival leader aims to maximize the cost of the coalition. Quite surprisingly, we establish that the leader acts as if it is a continuum of infinitesimal self-optimizing agents. With this structural result at hand, we investigate fundamental solution concepts of the considered cooperative game, namely the Nash Bargaining Scheme, the (Inner) Core and the min-max fair Nucleolus. Most notably, we show that they all bring about the system optimum.

Joint work with Prof. Ariel Orda.


Garden Hose Model - from Position-Based Quantum Cryptography to SAT Solvers

by Oded Margalit (IBM)

Recently, Harry Buhrman et al. introduced a novel communication complexity model, called the "garden hose" model.
The model sprouted from research on using quantum properties to enable position-based cryptography.
SAT is one of the fundamental NP-complete problems: finding a satisfying assignment to a CNF (Conjunctive Normal Form) formula, or proving that none exists.
We will not get into the details of the quantum physics, nor will we explain the inner workings of SAT solvers.
Instead, we will start from the mathematical garden hose model; describe how we used a SAT solver as a tool; give some lower and upper bounds on implementing the equality function; and conclude with open questions.
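For readers unfamiliar with SAT, here is what a CNF formula and a deliberately naive satisfiability check look like; the formula is made up, and real SAT solvers are vastly more sophisticated than this brute-force sketch:

```python
from itertools import product

# CNF over variables 1..3; a literal is +i (variable i) or -i (its negation).
# Formula: (x1 or not x2) and (x2 or x3) and (not x1 or not x3)
cnf = [[1, -2], [2, 3], [-1, -3]]

def satisfying_assignment(cnf, num_vars):
    """Brute-force search over all 2^n assignments; returns one satisfying
    assignment as {variable: bool}, or None if the formula is unsatisfiable."""
    for bits in product([False, True], repeat=num_vars):
        if all(any(bits[abs(lit) - 1] == (lit > 0) for lit in clause) for clause in cnf):
            return dict(enumerate(bits, start=1))
    return None

print(satisfying_assignment(cnf, 3))
```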


Matroid Secretary for Regular and Decomposable Matroids

by Michael Dinitz (Weizmann Institute of Science)

In the matroid secretary problem we are given a stream of elements and asked to choose a set of elements that maximizes the total value of the set, subject to being an independent set of a matroid given in advance. The difficulty comes from the assumption that decisions are irrevocable: if we choose to accept an element when it is presented by the stream then we can never get rid of it, and if we choose not to accept it then we cannot later add it. Babaioff, Immorlica, and Kleinberg [SODA 2007] introduced this problem and conjectured that every matroid admits an O(1)-competitive algorithm. While many classes of matroids are known to admit O(1)-competitive algorithms (e.g. graphic, laminar, and transversal matroids), a fundamental class of matroids is still open: vector matroids. We take a first step in this direction by giving an O(1)-competitive algorithm for regular matroids, the class of matroids that are representable over every field. Moreover, unlike much previous work, our techniques are fundamentally matroid-theoretic rather than graph-theoretic.
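To illustrate the online model (though not the algorithm for regular matroids itself, which is considerably more involved), a sample-then-threshold sketch with a generic independence oracle might look as follows; the stream, the threshold rule, and the rank-2 uniform matroid in the example are illustrative assumptions:

```python
def threshold_secretary(stream, is_independent, sample_frac=0.5):
    """Observe a prefix without accepting anything, set a value threshold from it,
    then greedily and irrevocably accept later elements that beat the threshold
    and keep the chosen set independent in the given matroid."""
    elements = list(stream)                          # (name, value) pairs, in arrival order
    m = int(len(elements) * sample_frac)
    sample, rest = elements[:m], elements[m:]
    threshold = max((value for _, value in sample), default=0.0)

    chosen = []
    for name, value in rest:
        candidate = [n for n, _ in chosen] + [name]
        if value > threshold and is_independent(candidate):
            chosen.append((name, value))             # irrevocable decision
    return chosen

# Example with a rank-2 uniform matroid: any set of at most 2 elements is independent.
stream = [("a", 3.0), ("b", 7.0), ("c", 5.0), ("d", 9.0), ("e", 6.0)]
print(threshold_secretary(stream, lambda s: len(s) <= 2))
```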

Joint work with Guy Kortsarz.


Braess's Paradox in Wireless Networks: The Danger of Improved Technology

by Merav Parter (Weizmann Institute of Science)

When comparing new wireless technologies, it is common to consider the effect that they have on the capacity of the network (defined as the maximum number of simultaneously satisfiable links). For example, it has been shown that giving receivers the ability to do interference cancellation, or allowing transmitters to use power control, never decreases the capacity and can in certain cases increase it by $\Omega(\log (\Delta \cdot P_{\max}))$, where $\Delta$ is the ratio of the longest link length to the smallest transmitter-receiver distance and $P_{\max}$ is the maximum transmission power. But there is no reason to expect the optimal capacity to be realized in practice, particularly since maximizing the capacity is known to be NP-hard. In reality, we would expect links to behave as self-interested agents, and thus when introducing a new technology it makes more sense to compare the values reached at game-theoretic equilibria than the optimum values.
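For concreteness, here is a sketch of the feasibility check underlying the capacity definition, using the standard SINR model; the gain, power, noise, and threshold inputs are generic placeholders rather than anything specific to the paper:

```python
from itertools import combinations

def sinr_feasible(links, gain, power, noise, beta):
    """links: list of (sender, receiver) pairs; gain[(s, r)]: channel gain from s to r.
    The set is simultaneously satisfiable if every receiver's SINR meets the threshold beta."""
    for s, r in links:
        signal = power[s] * gain[(s, r)]
        interference = sum(power[t] * gain[(t, r)] for t, _ in links if t != s)
        if signal / (noise + interference) < beta:
            return False
    return True

def capacity(all_links, gain, power, noise, beta):
    """Maximum number of simultaneously satisfiable links, found by brute force
    (the maximization is NP-hard in general, as noted above)."""
    for size in range(len(all_links), 0, -1):
        if any(sinr_feasible(list(subset), gain, power, noise, beta)
               for subset in combinations(all_links, size)):
            return size
    return 0
```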

In this paper we initiate this line of work by comparing various notions of equilibria (particularly Nash equilibria and no-regret behavior) when using a supposedly "better" technology. We show a version of Braess's Paradox for all of them: in certain networks, upgrading technology can actually make the equilibria worse, despite an increase in the capacity. We construct instances where this decrease is a constant factor for power control, interference cancellation, and improvements in the SINR threshold ($\beta$), and is $\Omega(\log \Delta)$ when power control is combined with interference cancellation. However, we show that these examples are basically tight: the decrease is at most $O(1)$ for power control, interference cancellation, and improved $\beta$, and is at most $O(\log \Delta)$ when power control is combined with interference cancellation.

Joint work with Michael Dinitz.


The Hub Labeling Algorithm

by Andrew V. Goldberg (Principal Researcher, Microsoft Research Silicon Valley)

Given a weighted graph, a distance oracle takes as input a pair of vertices and returns the distance between them. The labeling approach to distance oracle design is to precompute a label for every vertex so that distances can be computed from the corresponding labels, without looking at the graph. We survey results on hub labeling (HL), a labeling algorithm that has recently received a lot of attention in both theoretical and experimental contexts.
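In HL, the label of a vertex is a set of hubs together with the vertex's distance to each hub, chosen so that every pair of vertices has a common hub on some shortest path between them (the cover property). A query then only merges two labels; here is a minimal sketch with hand-made labels for a three-vertex path (the labels and graph are illustrative, not produced by any of the algorithms discussed below):

```python
def hl_query(label_u, label_v):
    """label_x maps hub -> dist(x, hub). The distance is the minimum, over hubs
    common to both labels, of dist(u, hub) + dist(hub, v); this is exact whenever
    the labeling satisfies the cover property."""
    best = float("inf")
    for hub, du in label_u.items():
        dv = label_v.get(hub)
        if dv is not None:
            best = min(best, du + dv)
    return best

# Toy labels for a unit-length path a - b - c, with hubs chosen by hand.
label = {
    "a": {"a": 0, "b": 1},
    "b": {"b": 0},
    "c": {"b": 1, "c": 0},
}
print(hl_query(label["a"], label["c"]))   # 2
```

The query cost is linear in the label sizes, which is why the norms of the label-size vector discussed next are the quantities to optimize.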

HL query time and memory requirements depend on the label size. While some graphs admit small labels, one can prove that there are graphs for which the labels must be large. Computing optimal hub labels is hard, but in polynomial time one can approximate them up to a factor of O(log n) [Cohen et al. 2003]. This can be done for the total label size (i.e., the memory required to store the labels), the maximum label size (which determines the worst-case query time), and in general for an Lp norm of the vector induced by the vertex label sizes. One can also simultaneously approximate Lp and Lq norms. One can improve the complexity of the original Cohen et al. algorithm to O(n^3 log n), making the algorithm practical for moderate-size networks.

Hierarchical labels are a special class of HL. For networks with small highway dimension, one can compute provably small hierarchical labels in polynomial time. On the other hand, one can prove that for some graphs hierarchical labels are significantly larger than the general ones. A heuristic for computing hierarchical labels leads to the fastest implementation of distance oracles for road networks and can handle continental-size networks. One can use label compression to trade off time for space, making the algorithm practical for a wider range of applications. We give experimental results showing that the heuristic hierarchical labels work well on road networks as well as some other graph classes, but not on all graphs. We also discuss efficient implementations of the provably good approximation algorithms and give experimental results.


Welfare Maximization and the Supermodular Degree

by Rani Izsak (Weizmann Institute of Science)

Given a set of items and a collection of players, each with a nonnegative monotone valuation set function over the items, the welfare maximization problem asks that every item be allocated to exactly one player so as to maximize the sum of the players' values, where each player's value is obtained by applying its valuation function to the bundle of items allocated to it. This problem in its full generality is NP-hard, and moreover, at least as hard to approximate as set packing. Better approximation guarantees are known for restricted classes of valuation functions. In this work we introduce a new parameter, the supermodular degree of a valuation function, which measures the extent to which the function exhibits supermodular behavior. We design an approximation algorithm for the welfare maximization problem whose approximation guarantee is linear in the supermodular degree of the underlying valuation functions.
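As a point of reference (and not the algorithm of this work, whose guarantee is parameterized by the supermodular degree), the simplest baseline allocates items one by one to the player with the largest marginal value; the players, items, and valuations in the example are made up:

```python
def greedy_welfare(items, valuations):
    """valuations[p] is a set function: frozenset of items -> nonnegative value.
    Assign each item to the player whose value increases the most."""
    bundles = {p: frozenset() for p in valuations}
    for item in items:
        def marginal(p):
            return valuations[p](bundles[p] | {item}) - valuations[p](bundles[p])
        best = max(valuations, key=marginal)
        bundles[best] = bundles[best] | {item}
    return bundles

# Example: player 1 is additive; player 2 has a complementarity for the pair {x, y}.
v1 = lambda s: 2 * len(s)
v2 = lambda s: 5 if {"x", "y"} <= s else len(s)
print(greedy_welfare(["x", "y", "z"], {1: v1, 2: v2}))
```

The example also shows how such a baseline can miss complementarities (player 2 values the pair {x, y} more than the sum of the singletons), which is exactly the kind of supermodular behavior the new parameter quantifies.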

Joint work with Uriel Feige.


Display Advertising with Third Parties

by Omri Weinstein (Princeton)

Display advertising is the most popular internet advertising model used on the web. In the typical setup, an auctioneer (e.g., the ad exchange) needs to sell a single unknown item to a set of potential advertisers, using the second-price mechanism. Previous literature studied the problem of revenue maximization in this setup, showing that the auctioneer may increase her revenue by "bundling" items and running a separate auction for each bundle. This approach implicitly assumes that the auctioneer has complete information about the next item (context) to be sold. In reality, however, "most" relevant information is actually provided by many "third party" mediators, each of which has some private information about the item (context) which is about to be sold.

We introduce a generalized model for display advertising which captures such information structures, and study the revenue maximization problem in the presence of mediators, in both strategic and non-strategic settings. We propose a mechanism for soliciting information from third-party mediators, inspired by the Shapley value distribution, and give a tight analysis of the price of anarchy of this mechanism. In particular, we show that it is always beneficial for the auctioneer to run this mechanism, as using it will never decrease her revenue. We also show that this mechanism admits a pure Nash equilibrium, via a simple proof which surprisingly extends to general "Shapley" games and, to the best of our knowledge, was not previously known. In the non-strategic setup, we consider the problem where the auctioneer can control the information sets reported by each of the m mediators, and needs to aggregate this information in a distributed manner so as to maximize her revenue. We show that this problem is hard to approximate even to within a factor of $\tilde{O}(m^{1/4})$, via a gap-preserving reduction from the densest-k-subgraph problem, and we give a 2-approximation for a special and natural class of mediators.
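To see why information about the context matters for second-price revenue, here is a toy computation (advertiser names, contexts, and values are all invented) comparing per-context auctions against one bundle sold on expected values:

```python
def second_price_revenue(bids):
    """Revenue of a single second-price auction: the runner-up bid."""
    return sorted(bids, reverse=True)[1] if len(bids) >= 2 else 0.0

# Two advertisers, two equally likely contexts; values[advertiser][context].
values = {"A": {"sports": 10.0, "news": 0.0},
          "B": {"sports": 0.0, "news": 10.0}}
contexts = ["sports", "news"]

# Full disclosure: a separate auction per context, with truthful bids equal to the values.
per_context = sum(second_price_revenue([values[a][c] for a in values]) for c in contexts)

# One bundle covering both contexts: risk-neutral bidders bid their expected value.
bundled = len(contexts) * second_price_revenue(
    [sum(values[a][c] for c in contexts) / len(contexts) for a in values])

print(per_context, bundled)   # 0.0 vs 10.0
```

With opposite preferences, revealing the context drives every second price to zero, while bundling keeps competition alive; the mediators in the model above control precisely this kind of information.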

Joint work with Moshe Tennenholtz and Moran Feldman.


Sparse Principal Component Analysis: Dealing with High-Dimensional Data

by Danny Vilenchik (Weizmann Institute of Science)

Our society invests massively in the collection and processing of data of all kinds, on scales unimaginable until recently. In the classical setting, one collected many observations on a small set of carefully chosen variables of interest. Modern problems often have a different flavor: the phenomenon one tries to explain may depend on only a few variables, but a priori one doesn't know which variables to measure. This opens the door to high-dimensional problems, where the number of variables is on the order of (or larger than) the number of observations. The high-dimensional setting raises two main concerns: (1) many of the classical statistical methods break down; (2) the computational complexity of solving a problem usually depends on the problem's dimension, and when the dimension is huge, this aspect cannot be ignored.

In this talk we focus on a classical method for dimension reduction called Principal Component Analysis (PCA). We are interested in the high-dimensional setting with sparsity assumptions. We describe several algorithmic methods for solving sparse PCA, from a computationally lightweight greedy algorithm to a heavy-duty semidefinite programming (SDP) approach.
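As a flavor of the lightweight end of that spectrum, here is a sketch of a standard greedy forward-selection heuristic for sparse PCA (one common variant only; the data and support size below are invented, and the talk's greedy and SDP algorithms may differ in the details):

```python
import numpy as np

def greedy_sparse_pca(cov, k):
    """Pick k coordinates greedily: at each step add the coordinate that maximizes
    the largest eigenvalue of the covariance matrix restricted to the chosen support.
    Returns the support and the corresponding sparse principal direction."""
    support = []
    for _ in range(k):
        best_var, best_gain = None, -np.inf
        for j in range(cov.shape[0]):
            if j in support:
                continue
            idx = support + [j]
            gain = np.linalg.eigvalsh(cov[np.ix_(idx, idx)])[-1]
            if gain > best_gain:
                best_var, best_gain = j, gain
        support.append(best_var)
    idx = sorted(support)
    eigvals, eigvecs = np.linalg.eigh(cov[np.ix_(idx, idx)])
    direction = np.zeros(cov.shape[0])
    direction[idx] = eigvecs[:, -1]          # top eigenvector on the chosen support
    return idx, direction

# Example: 5 variables, the signal lives on coordinates {0, 1}.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
X[:, 1] += X[:, 0]                           # correlate coordinates 0 and 1
cov = np.cov(X, rowvar=False)
print(greedy_sparse_pca(cov, 2)[0])          # expected support: [0, 1]
```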

Joint works with Robert Krauthgamer and Boaz Nadler.


New Approaches to Graph Partitioning

by Roy Schwartz (Technion)

Graph partitioning problems capture some of the most basic combinatorial optimization problems, and are motivated by a wide range of applications, emerging both in theory and practice.
Partitioning problems have been studied extensively in the last few decades, exhibiting beautiful connections to metric geometry, functional analysis, probability theory, and computational complexity.
In this talk we discuss novel techniques for partitioning problems, considering two fundamental examples: Non-Uniform Partitioning and Multiway Cut.
For the former we obtain the first provable guarantee, while for the latter we simplify and improve the result of Karger et al. [STOC 1999].
Our new approaches to these problems employ a combination of combinatorial, stochastic and geometric techniques.

