Welcome!


I graduated from Caltech with a PhD in Electrical Engineering, advised by Professor Babak Hassibi. I was a postdoc at UC Berkeley during 2014-2015, sponsored by Ben Recht, and was fortunate to be a Simons Fellow in Spring 2015. Until recently I was a software engineer at Google; I am currently a member of the research staff at the Voleon Group.

My research lies at the crossroads of statistical machine learning, high-dimensional statistics, and mathematical optimization. I use tools from probability and optimization theory to attack fundamental problems in signal processing and machine learning, and to develop next-generation techniques for large-scale data processing. My work involves proposing and analyzing fast, reliable algorithms with provable guarantees. Particular research directions include convex/nonconvex optimization for machine learning, sparse and low-rank approximation techniques, deep learning theory, and dimensionality reduction. For my resume, click here.

Samet Oymak

Email: sametoymak at gmail dot com

Selected research topics

I am broadly interested in problems from machine learning, signal processing, and high-dimensional statistics. My recent work includes the following topics.

Sharp Time-Data Tradeoffs for Linear Inverse Problems: We present a unified convergence analysis of the gradient projection algorithm applied to inverse problems. We sharply characterize the convergence rate associated with a wide variety of random measurement ensembles in terms of the number of samples and the structural complexity of the signal. (A toy sketch follows the references below.)
• S. Oymak, B. Recht, and M. Soltanolkotabi, ``Sharp Time-Data Tradeoffs for Linear Inverse Problems,'' under revision Nov 2016.
• S. Oymak and M. Soltanolkotabi, ``Fast and Reliable Parameter Estimation from Nonlinear Observations,'' manuscript Oct. 2016.
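
As a toy illustration of the algorithm discussed above, here is a minimal Python sketch of projected gradient descent for a sparse linear inverse problem. The constraint set (k-sparse vectors), step size, and problem dimensions are my illustrative choices, not the exact setup analyzed in the paper.

    import numpy as np

    def project_sparse(x, k):
        # Euclidean projection onto k-sparse vectors:
        # keep the k largest-magnitude entries, zero out the rest.
        z = np.zeros_like(x)
        idx = np.argsort(np.abs(x))[-k:]
        z[idx] = x[idx]
        return z

    def projected_gradient(A, y, k, mu=1.0, iters=100):
        # Gradient step on 0.5*||y - Ax||^2, then project onto the signal set.
        x = np.zeros(A.shape[1])
        for _ in range(iters):
            x = x + mu * (A.T @ (y - A @ x))
            x = project_sparse(x, k)
        return x

    # Toy instance: a 10-sparse signal in R^500 from 200 Gaussian measurements.
    rng = np.random.default_rng(0)
    n, p, k = 200, 500, 10
    A = rng.standard_normal((n, p)) / np.sqrt(n)
    x_star = np.zeros(p)
    x_star[rng.choice(p, size=k, replace=False)] = rng.standard_normal(k)
    y = A @ x_star
    x_hat = projected_gradient(A, y, k)
    print("recovery error:", np.linalg.norm(x_hat - x_star))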

Universality laws for dimension reduction: Dimension reduction is the process of embedding high-dimensional data into a lower dimensional space to facilitate its analysis. In the Euclidean setting, one fundamental technique for dimension reduction is to apply a random linear map to the data. This dimension reduction procedure succeeds when it preserves certain geometric features of the set. We study a natural family of randomized dimension reduction maps and a large class of data sets and prove that there is a phase transition in the success probability of the dimension reduction map as the embedding dimension increases. (A toy experiment follows the reference below.)
• S. Oymak and J. Tropp, ``Universality laws for randomized dimension reduction, with applications,'' manuscript Dec 2015.
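
The toy experiment below (an illustrative setup of my own, not the paper's) hints at this behavior: as the embedding dimension m grows, the worst-case distortion of pairwise distances in a point cloud drops, and a Rademacher (random sign) map behaves much like a Gaussian one.

    import numpy as np

    rng = np.random.default_rng(1)
    d, npts = 200, 50
    X = rng.standard_normal((npts, d))     # a point cloud in R^d
    diffs = X[:, None, :] - X[None, :, :]  # all pairwise difference vectors

    def max_distortion(S):
        # Worst relative distortion of pairwise distances under x -> Sx.
        orig = np.linalg.norm(diffs, axis=2)
        emb = np.linalg.norm(diffs @ S.T, axis=2)
        mask = orig > 0
        return np.max(np.abs(emb[mask] / orig[mask] - 1))

    for m in (10, 40, 160):
        G = rng.standard_normal((m, d)) / np.sqrt(m)           # Gaussian map
        R = rng.choice([-1.0, 1.0], size=(m, d)) / np.sqrt(m)  # Rademacher map
        print(m, max_distortion(G), max_distortion(R))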

Sketching any set with RIP matrices: We show that, for the purposes of dimensionality reduction, a certain class of structured random matrices behaves similarly to random Gaussian matrices. This class includes several matrices whose matrix-vector product can be computed in log-linear time, enabling efficient dimensionality reduction of general sets. (An illustrative construction follows the reference below.)
• S. Oymak, B. Recht, and M. Soltanolkotabi, ``Isometric sketching of any set via the Restricted Isometry Property,'' under revision Dec 2016.
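
As an illustrative instance (my choice of construction, in the spirit of the paper rather than a quote of it), the sketch below applies a randomly subsampled unitary DFT after a random sign flip; the matrix-vector product costs O(d log d) via the FFT and the matrix is never formed explicitly.

    import numpy as np

    rng = np.random.default_rng(2)
    d, m = 1024, 128
    signs = rng.choice([-1.0, 1.0], size=d)      # random diagonal sign matrix D
    rows = rng.choice(d, size=m, replace=False)  # random row subsample

    def sketch(x):
        # The m x d map x -> sqrt(d/m) * (subsampled unitary FFT)(D x),
        # applied in O(d log d) time without forming the matrix.
        return np.sqrt(d / m) * np.fft.fft(signs * x, norm="ortho")[rows]

    x = rng.standard_normal(d)
    print(np.linalg.norm(sketch(x)) / np.linalg.norm(x))  # close to 1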

Simultaneously structured models: In several applications, the signal of interest has several structures at the same time. Examples include quadratic compressed sensing, sparse PCA, low-rank tensor completion, and the estimation of sparse and smooth signals (fused lasso). We show a weakness result for multi-objective convex penalizations that aim to recover and/or estimate such signals. The recently updated paper has simpler failure conditions that apply to a wide range of measurement ensembles under much weaker assumptions. (A toy program follows the reference below.)
• S. Oymak, A. Jalali, M. Fazel, Y. C. Eldar, and B. Hassibi, ``Simultaneously Structured Models with Application to Sparse and Low-rank Matrices,'' Trans. on Info. Theory, PDF.
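
For concreteness, here is a minimal sketch (using cvxpy; the weight lam and problem sizes are illustrative choices) of the kind of multi-objective convex program the result concerns: mixing the l1 and nuclear norms to estimate a matrix that is simultaneously sparse and low rank.

    import cvxpy as cp
    import numpy as np

    # Ground truth: a rank-1 matrix supported on few rows/columns,
    # hence simultaneously sparse and low rank.
    rng = np.random.default_rng(3)
    d, n = 20, 150
    u = np.zeros(d)
    u[:4] = rng.standard_normal(4)
    X_star = np.outer(u, u)

    # n random Gaussian linear measurements of the matrix.
    A = rng.standard_normal((n, d * d)) / np.sqrt(n)
    y = A @ X_star.ravel(order="F")  # X_star is symmetric, so vec order is immaterial

    # Multi-objective penalty: l1 norm (sparsity) + nuclear norm (low rank).
    X = cp.Variable((d, d))
    lam = 1.0  # illustrative weight; the paper shows no weighting is order-optimal
    prob = cp.Problem(cp.Minimize(cp.sum(cp.abs(X)) + lam * cp.normNuc(X)),
                      [A @ cp.vec(X) == y])
    prob.solve()
    print("estimation error:", np.linalg.norm(X.value - X_star))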

Precise graph clustering: The graph clustering problem arises in community detection and social networking setups. Given basic statistics of the graph, can we deduce whether (planted) clusters are identifiable? We develop simple but precise performance formulae for convex optimization based clustering techniques. These bounds closely resemble the Donoho-Tanner phase transitions for L1 minimization. (A toy relaxation follows the references below.)
• R. Korlakai Vinayak, S. Oymak, and B. Hassibi, ``Sharp Performance Bounds for Graph Clustering via Convex Optimization,'' ICASSP 2014, full report.
• R. Korlakai Vinayak, S. Oymak, and B. Hassibi, ``Graph Clustering With Missing Data: Convex Algorithms and Analysis,'' NIPS 2014.
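
A minimal sketch of a convex relaxation in this spirit (planted-partition sizes, edge probabilities, and the weight lam are illustrative choices, not the exact programs in the papers): decompose the observed adjacency matrix into a low-rank cluster matrix plus a sparse noise term.

    import cvxpy as cp
    import numpy as np

    # Planted partition: two clusters of 15 nodes; in-cluster edges appear
    # with probability 0.9 and cross-cluster (noise) edges with probability 0.1.
    rng = np.random.default_rng(4)
    n = 30
    truth = np.zeros((n, n))
    truth[:15, :15] = truth[15:, 15:] = 1.0
    edge_probs = np.where(truth == 1, 0.9, 0.1)
    upper = np.triu((rng.random((n, n)) < edge_probs).astype(float), 1)
    adj = upper + upper.T + np.eye(n)  # symmetric adjacency with self-loops

    # Convex relaxation: adjacency = low-rank cluster matrix + sparse noise.
    L = cp.Variable((n, n), symmetric=True)
    S = cp.Variable((n, n))
    lam = 1.0 / np.sqrt(n)  # robust-PCA style weighting, illustrative
    relax = cp.Problem(cp.Minimize(cp.normNuc(L) + lam * cp.sum(cp.abs(S))),
                       [L + S == adj, L >= 0, L <= 1])
    relax.solve()
    print(np.round(L.value[:4, :4], 1))  # entries near 1 within the first cluster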

Recent manuscripts


• S. Oymak and M. Soltanolkotabi, ``Fast and Reliable Parameter Estimation from Nonlinear Observations,'' manuscript Oct. 2016.

• S. Oymak, ``Near-optimal sample complexity bounds for circulant binary embedding,'' Feb. 2016; short version to appear at ICASSP 2017.

• S. Oymak, B. Recht, and M. Soltanolkotabi, ``Sharp Time-Data Tradeoffs for Linear Inverse Problems,'' under revision Nov 2016.

• K. Jaganathan, S. Oymak, and B. Hassibi, ``Sparse Phase Retrieval: Uniqueness Guarantees and Recovery Algorithms,'' accepted to Transactions on Signal Processing Aug. 2016, arXiv:1311.2745.

• S. Oymak and J. Tropp, ``Universality laws for randomized dimension reduction, with applications,'' manuscript Dec 2015.

• S. Oymak, B. Recht, and M. Soltanolkotabi, ``Isometric sketching of any set via the Restricted Isometry Property,'' under revision Dec 2016.

• S. Oymak and B. Recht, ``Near optimal bounds for binary embeddings of arbitrary sets,'' manuscript Dec 2015.

• X. Pan, D. Papailiopoulos, S. Oymak, B. Recht, K. Ramchandran, and M. I. Jordan, ``Parallel Correlation Clustering on Big Graphs,'' NIPS 2015.

• C. Thrampoulidis, S. Oymak, and B. Hassibi, ``Regularized Linear Regression: A Precise Analysis of the Estimation Error,'' COLT 2015.

Selected publications

For the complete list of publications, please click here.

High-dimensional statistics

• S. Oymak, B. Recht, and M. Soltanolkotabi, ``Sharp Time-Data Tradeoffs for Linear Inverse Problems,'' under revision Nov 2016.
• C. Thrampoulidis, S. Oymak, and B. Hassibi, ``Regularized Linear Regression: A Precise Analysis of the Estimation Error,'' COLT 2015.
• C. Thrampoulidis, S. Oymak, and B. Hassibi, ``Simple Error Bounds for Regularized Noisy Linear Inverse Problems,'' ISIT 2014, arXiv:1401.6578.
• S. Oymak, C. Thrampoulidis, and B. Hassibi, ``The Squared-Error of Generalized LASSO: A Precise Analysis,'' in submission, short version appeared at Allerton 2013, arXiv:1311.0830, code.
• S. Oymak, A. Jalali, M. Fazel, Y. C. Eldar, and B. Hassibi, ``Simultaneously Structured Models with Application to Sparse and Low-rank Matrices,'' Trans. on Info. Theory, arXiv:1212.3753.
• S. Oymak and B. Hassibi, ``Sharp MSE Bounds for Proximal Denoising,'' Foundations of Computational Mathematics Oct 2015, partial results appeared in Allerton 2012, arXiv:1305.2714.
• S. Oymak, A. Khajehnejad, and B. Hassibi, ``Recovery Threshold for Optimal Weight $\ell_1$ Minimization,'' International Symposium on Information Theory (ISIT) 2012, Cambridge, MA, PDF.
• S. Oymak, K. Mohan, M. Fazel, and B. Hassibi, ``A Simplified Approach to Recovery Conditions for Low Rank Matrices,'' ISIT 2011, St. Petersburg, Russia, arXiv:1103.1178.

Dimensionality reduction and embedding techniques

• S. Oymak and J. Tropp, ``Universality laws for randomized dimension reduction, with applications,'' manuscript Dec 2015.
• S. Oymak, B. Recht, and M. Soltanolkotabi, ``Isometric sketching of any set via the Restricted Isometry Property,'' under revision Dec 2016.
• S. Oymak and B. Hassibi, ``The proportional mean decomposition: A bridge between the Gaussian and Bernoulli ensembles,'' ICASSP 2015.
• S. Oymak and B. Recht, ``Near optimal bounds for binary embeddings of arbitrary sets,'' manuscript Dec 2015.
• S. Oymak, A. Khajehnejad, and B. Hassibi, ``Subspace Expanders and Matrix Rank Minimization,'' ISIT 2011, PDF.
• S. Oymak and B. Hassibi, ``New Null Space Results and Recovery Thresholds for Matrix Rank Minimization,'' partial results appeared in ISIT 2011, technical report arXiv:1011.6326.

Graph processing

• X. Pan, D. Papailiopoulos, S. Oymak, B. Recht, K. Ramchandran, and M. I. Jordan, ``Parallel Correlation Clustering on Big Graphs,'' NIPS 2015.
• R. Korlakai Vinayak, S. Oymak, and B. Hassibi, ``Graph Clustering With Missing Data: Convex Algorithms and Analysis,'' NIPS 2014.
• S. Oymak and B. Hassibi, ``Finding Dense Clusters via Low Rank + Sparse Decomposition,'' related results appeared at ICASSP 2014 and NIPS 2014, technical report.

Nonlinear models / Phase retrieval

• K. Jaganathan, S. Oymak, and B. Hassibi, ``Sparse Phase Retrieval: Uniqueness Guarantees and Recovery Algorithms,'' accepted to Transactions on Signal Processing Aug. 2016, arXiv:1311.2745.
• K. Jaganathan, S. Oymak, and B. Hassibi, ``Sparse Phase Retrieval: Convex Algorithms and Limitations,'' ISIT 2013, arXiv:1303.4128, code.
• K. Jaganathan, S. Oymak, and B. Hassibi, ``Recovery of Sparse 1-D Signals from the Magnitudes of their Fourier Transform,'' ISIT 2012, arXiv:1206.1405, code.
• K. Jaganathan, S. Oymak, and B. Hassibi, ``Phase Retrieval for Sparse Signals Using Rank Minimization,'' ICASSP 2012, PDF.

My thesis can be found here. Finally, you can click here for some personal facts.