# Raichel, Benjamin

Permanent URI for this collection: https://hdl.handle.net/10735.1/6182

Benjamin Raichel is an Assistant Professor of Computer Science. His research interests include:

- Algorithms and Theory
- Discrete and Computational Geometry
- Randomized and Approximation Algorithms

In 2018 the National Science Foundation awarded Dr. Raichel a CAREER Award for his project on geometry-based data analysis.


## Browsing Raichel, Benjamin by Issue Date

Now showing 1 - 3 of 3


### Geometric Packing Under Nonuniform Constraints

(SIAM Publications, 2018-10-22) Ene, Alina; Har-Peled, Sariel; Raichel, Benjamin A.

We study the problem of discrete geometric packing. Here, given weighted regions (say, in the plane) and points (with capacities), one has to pick a maximum weight subset of the regions such that no point is covered more than its capacity. We provide a general framework and an algorithm for approximating the optimal solution for packing in hypergraphs arising out of such geometric settings. Using this framework we get a flotilla of results on this problem (and also on its dual, where one wants to pick a maximum weight subset of the points when the regions have capacities). For example, for the case of fat triangles of similar size, we show an O(1)-approximation and prove that no PTAS is possible.

### Viewing the Rings of a Tree: Minimum Distortion Embeddings into Trees

(Association for Computing Machinery, 2019-01) Nayyeri, A.; Raichel, Benjamin

We describe a (1 + ε)-approximation algorithm for finding the minimum distortion embedding of an n-point metric space (X, d_X) into a tree with vertex set X. The running time of our algorithm is n² · (∆/ε)^{(O(δ_{opt}/ε))^{2λ+1}}, parameterized with respect to the spread of X, denoted by ∆; the minimum possible distortion for embedding X into any tree, denoted by δ_{opt}; and the doubling dimension of X, denoted by λ. Hence we obtain a PTAS, provided δ_{opt} is a constant and X is a finite doubling metric space with polynomially bounded spread, for example, a point set with polynomially bounded spread in constant-dimensional Euclidean space. Our algorithm implies a constant factor approximation with the same running time when Steiner vertices are allowed. Moreover, we describe a similar (1 + ε)-approximation algorithm for finding a tree spanner of (X, d_X) that minimizes the maximum stretch. The running time of our algorithm stays the same, except that δ_{opt} must be interpreted as the minimum stretch of any spanning tree of X. Finally, we generalize our tree spanner algorithm to a (1 + ε)-approximation algorithm for computing a minimum stretch tree spanner of a weighted graph, where the running time is parameterized with respect to the maximum degree, in addition to the other parameters above. In particular, we obtain a PTAS for computing minimum stretch tree spanners of weighted graphs with polynomially bounded spread, constant doubling dimension, and constant maximum degree, when a tree spanner with constant stretch exists. Copyright © 2019 by SIAM.

### Sparse Approximate Conic Hulls

(Neural Information Processing Systems Foundation) Van Buskirk, Gregory; Raichel, Benjamin; Ruozzi, Nicholas

We consider the problem of computing a restricted nonnegative matrix factorization (NMF) of an m × n matrix X. Specifically, we seek a factorization X ≈ BC, where the k columns of B are a subset of those from X and C ∈ ℝ^{k×n}_{≥0}. Equivalently, given the matrix X, consider the problem of finding a small subset, S, of the columns of X such that the conic hull of S ε-approximates the conic hull of the columns of X, i.e., the distance of every column of X to the conic hull of the columns of S should be at most an ε-fraction of the angular diameter of X. If k is the size of the smallest ε-approximation, then we produce an O(k/ε^{2/3})-sized O(ε^{1/3})-approximation, yielding the first provable, polynomial-time ε-approximation for this class of NMF problems, where also desirably the approximation is independent of n and m. Furthermore, we prove an approximate conic Carathéodory theorem, a general sparsity result, that shows that any column of X can be ε-approximated with an O(1/ε²)-sparse combination from S. Our results are facilitated by a reduction to the problem of approximating convex hulls, and we prove that both the convex and conic hull variants are d-SUM-hard, resolving an open problem. Finally, we provide experimental results for the convex and conic algorithms on a variety of feature selection tasks. © 2017 Neural Information Processing Systems Foundation. All rights reserved.
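The conic-hull criterion in the last abstract measures, for each column x of X, how far x lies from the cone generated by a selected column subset S. A minimal sketch of that distance computation via nonnegative least squares (this illustrates only the quantity being approximated, not the paper's selection algorithm; the function name `conic_residuals` and the toy matrices are hypothetical):

```python
import numpy as np
from scipy.optimize import nnls  # nonnegative least squares


def conic_residuals(X, S):
    """For each column x of X, the Euclidean distance from x to the
    conic hull of the columns of S, i.e. min_{c >= 0} ||S c - x||."""
    # nnls returns (solution, residual_norm); keep the residual.
    return np.array([nnls(S, X[:, j])[1] for j in range(X.shape[1])])


# Toy example (assumed data): both columns of X are nonnegative
# combinations of the columns of S, so their residuals are ~0.
S = np.array([[1.0, 0.0],
              [0.0, 1.0]])
X = np.array([[1.0, 2.0],
              [1.0, 0.0]])
print(conic_residuals(X, S))  # residuals ≈ 0 for columns inside the cone
```

A column outside the cone, such as (-1, 0) with this S, projects to the apex and yields a strictly positive residual, which is the distance the ε-approximation guarantee bounds.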