Abstract
I will present a generalized framework for structured sparsity.
This framework extends the well-known methods of Lasso and
Group Lasso by incorporating additional constraints on
the variables as part of a convex optimization problem.
This provides a straightforward way of favoring prescribed sparsity patterns, such as orderings, contiguous regions, and overlapping groups.
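As a rough illustration only (the squared-error loss, the auxiliary variables $\lambda$, the constraint set $\Lambda$, and the penalty $\Omega$ below are assumed notation for this sketch, not necessarily the exact formulation of the talk), problems of this kind can be written as

\[
\min_{\beta \in \mathbb{R}^d,\; \lambda \in \Lambda} \; \tfrac{1}{2}\,\|y - X\beta\|_2^2 \;+\; \rho\,\Omega(\beta, \lambda),
\]

where the convex set $\Lambda$ constrains the auxiliary variables so as to encode the desired structure. For instance, the variational identity $|\beta_i| = \inf_{\lambda_i > 0} \tfrac{1}{2}\bigl(\beta_i^2/\lambda_i + \lambda_i\bigr)$ shows how leaving $\Lambda$ unconstrained can recover an $\ell_1$ penalty, whereas restricting $\Lambda$ (e.g., to ordered or group-structured $\lambda$) favors other sparsity patterns.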
Available optimization methods for such problems tend not to scale well with
sample size and dimensionality. We instead propose a novel first-order
proximal method, which builds upon results on fixed points and successive
approximations. The algorithm relies on a proximity-operator subproblem
that can be computed numerically.
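A minimal sketch of a generic forward-backward (proximal-gradient) loop of this kind is given below; the function names, the soft-thresholding placeholder standing in for the structured penalty's proximity operator, and all parameters are assumptions for illustration, not the specific algorithm of the talk.

    import numpy as np

    def prox_penalty(v, rho):
        # Placeholder proximity operator: soft-thresholding (the prox of
        # rho * ||.||_1). In the structured setting, this step would instead
        # be solved numerically by an inner routine.
        return np.sign(v) * np.maximum(np.abs(v) - rho, 0.0)

    def proximal_gradient(X, y, rho, n_iter=500):
        # Generic proximal-gradient iteration for
        #   min_beta  0.5 * ||y - X beta||^2 + rho * Omega(beta)
        n, d = X.shape
        beta = np.zeros(d)
        L = np.linalg.norm(X, 2) ** 2      # Lipschitz constant of the smooth part
        step = 1.0 / L
        for _ in range(n_iter):
            grad = X.T @ (X @ beta - y)    # gradient of the least-squares loss
            beta = prox_penalty(beta - step * grad, step * rho)  # proximal step
        return beta

    # Example usage on synthetic data.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 50))
    beta_true = np.zeros(50)
    beta_true[:5] = 1.0
    y = X @ beta_true + 0.1 * rng.standard_normal(100)
    beta_hat = proximal_gradient(X, y, rho=0.1)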
Experiments on different regression problems
demonstrate the efficiency of the optimization algorithm
and its scalability with the size of the problem, as well as
state-of-the-art statistical performance that improves over the Lasso and
StructOMP.