    Constantine Caramanis

    Professor, Dept. of Electrical and Computer Engineering

    Chandra Family Endowed Distinguished Professorship in Electrical and Computer Engineering

    Member of the Computer Science Graduate Studies Committee

    Office: 2501 Speedway, EER Building Room 6.820

    e-mail: constantine at utexas.edu



I am a Professor in the ECE department of The University of Texas at Austin. I received a PhD in EECS from the Massachusetts Institute of Technology, where I was a member of the Laboratory for Information and Decision Systems (LIDS), and an AB in Mathematics from Harvard University. I received the NSF CAREER award in 2011, and I am an IEEE Fellow.


My current research centers on decision-making in large-scale complex systems, with an emphasis on learning and computation. Specifically, I am interested in robust and adaptable optimization, high-dimensional statistics and machine learning, and applications to large-scale networks, including social networks, wireless networks, transportation networks, and energy networks. I have also worked on applications of machine learning and optimization to computer-aided design.


I am affiliated with the NSF Institute for Foundations of Machine Learning, the Machine Learning Lab, and the Archimedes Research Center in Athens.


Teaching

I have created two classes, which I have made available online.


Research Group

Current Group

Group Alumni


My ten most recent publications, in reverse chronological order:

  1. Atsidakou, Alexia, Constantine Caramanis, Evangelia Gergatsouli, Orestis Papadigenopoulos, and Christos Tzamos. “Contextual Pandora’s Box.” Association for the Advancement of Artificial Intelligence (AAAI), 2024.
  2. Rout, Litu, Yujia Chen, Abhishek Kumar, Constantine Caramanis, Sanjay Shakkottai, and Wen-Sheng Chu. “Beyond First-Order Tweedie: Solving Inverse Problems Using Latent Diffusion.” Preprint, 2023.
  3. Rout, Litu, Advait Parulekar, Constantine Caramanis, and Sanjay Shakkottai. “A Theoretical Justification for Image Inpainting Using Denoising Diffusion Probabilistic Models.” Preprint, 2023.
  4. Faw, Matthew, Litu Rout, Constantine Caramanis, and Sanjay Shakkottai. “Beyond Uniform Smoothness: A Stopped Analysis of Adaptive SGD.” Conference on Learning Theory (COLT), 2023.
  5. Atsidakou, Alexia, Branislav Kveton, Sumeet Katariya, Constantine Caramanis, and Sujay Sanghavi. “Logarithmic Bayes Regret Bounds.” Advances in Neural Information Processing Systems (NeurIPS), 2023.
  6. Caramanis, Constantine, Dimitris Fotakis, Alkis Kalavasis, Vasilis Kontonis, and Christos Tzamos. “Optimizing Solution-Samplers for Combinatorial Problems: The Landscape of Policy-Gradient Methods.” Advances in Neural Information Processing Systems (NeurIPS), 2023.
  7. Rout, Litu, Negin Raoof, Giannis Daras, Constantine Caramanis, Alexandros Dimakis, and Sanjay Shakkottai. “Solving Linear Inverse Problems Provably via Posterior Sampling with Latent Diffusion Models.” Advances in Neural Information Processing Systems (NeurIPS), 2023.
  8. Kwon, Jeongyeol, Yonathan Efroni, Constantine Caramanis, and Shie Mannor. “Reward-Mixing MDPs with Few Latent Contexts Are Learnable.” In International Conference on Machine Learning (ICML), 18057–82. PMLR, 2023.
  9. Faw, Matthew, Isidoros Tziotis, Constantine Caramanis, Aryan Mokhtari, Sanjay Shakkottai, and Rachel Ward. “The Power of Adaptivity in SGD: Self-Tuning Step Sizes with Unbounded Gradients and Affine Variance.” The Conference on Learning Theory (COLT), 2022.
  10. Kwon, Jeongyeol, Yonathan Efroni, Constantine Caramanis, and Shie Mannor. “Coordinated Attacks against Contextual Bandits: Fundamental Limits and Defense Mechanisms.” International Conference on Machine Learning (ICML), 2022.