
    Constantine Caramanis

    Professor, Dept. of Electrical and Computer Engineering

    Chandra Family Endowed Distinguished Professorship in Electrical and Computer Engineering

    Member of the Computer Science Graduate Studies Committee

    Office: 2501 Speedway, EER Building Room 6.820

    e-mail: constantine at utexas.edu



I am a Professor in the ECE department of The University of Texas at Austin. I received my PhD in EECS from the Massachusetts Institute of Technology, in the Laboratory for Information and Decision Systems (LIDS), and my AB in Mathematics from Harvard University. I received the NSF CAREER award in 2011, and I am an IEEE Fellow.


My current research centers on decision-making in large-scale complex systems, with a focus on learning and computation. Specifically, I am interested in robust and adaptable optimization, high-dimensional statistics and machine learning, and applications to large-scale networks, including social, wireless, transportation, and energy networks. I have also worked on applications of machine learning and optimization to computer-aided design.


I am affiliated with the NSF Institute for Foundations of Machine Learning, the Machine Learning Lab, and the Archimedes Research Center in Athens.


Teaching

I have created three classes, which I have made available online.


Research Group

Current Group

Group Alumni


My ten most recent publications, in reverse chronological order:

  1. Rout, Litu, Yujia Chen, Nataniel Ruiz, Constantine Caramanis, Sanjay Shakkottai, and Wen-Sheng Chu. “Semantic Image Inversion and Editing Using Rectified Stochastic Differential Equations.” Proceedings of the International Conference on Learning Representations (ICLR), 2025.
  2. Raoof, Negin, Litu Rout, Giannis Daras, Sujay Sanghavi, Constantine Caramanis, Sanjay Shakkottai, and Alex Dimakis. “Infilling Score: A Pretraining Data Detection Algorithm for Large Language Models.” Proceedings of the International Conference on Learning Representations (ICLR), 2025.
  3. Rout, Litu, Yujia Chen, Nataniel Ruiz, Abhishek Kumar, Constantine Caramanis, Sanjay Shakkottai, and Wen-Sheng Chu. “RB-Modulation: Training-Free Personalization of Diffusion Models Using Stochastic Optimal Control.” Proceedings of the International Conference on Learning Representations (ICLR), 2025.
  4. Rout, Litu, Yujia Chen, Abhishek Kumar, Constantine Caramanis, Sanjay Shakkottai, and Wen-Sheng Chu. “Beyond First-Order Tweedie: Solving Inverse Problems Using Latent Diffusion.” Proceedings of the Conference on Computer Vision and Pattern Recognition (CVPR), 2024.
  5. Zhuo, Jiacheng, Jeongyeol Kwon, Nhat Ho, and Constantine Caramanis. “On the Computational and Statistical Complexity of Over-Parameterized Matrix Sensing.” Journal of Machine Learning Research, 2024.
  6. Atsidakou, Alexia, Constantine Caramanis, Evangelia Gergatsouli, Orestis Papadigenopoulos, and Christos Tzamos. “Contextual Pandora’s Box.” Proceedings of the AAAI Conference on Artificial Intelligence (AAAI), 2024.
  7. Tsikouras, Nikos, Constantine Caramanis, and Christos Tzamos. “Optimization Can Learn Johnson Lindenstrauss Embeddings.” Advances in Neural Information Processing Systems (NeurIPS), 2024.
  8. Kwon, Jeongyeol, Yonathan Efroni, Shie Mannor, and Constantine Caramanis. “RL in Latent MDPs Is Tractable: Online Guarantees via Off-Policy Evaluation.” Advances in Neural Information Processing Systems (NeurIPS), 2024.
  9. Kwon, Jeongyeol, Yonathan Efroni, Shie Mannor, and Constantine Caramanis. “Prospective Side Information for Latent MDPs.” Proceedings of the International Conference on Machine Learning (ICML), 2024.
  10. Rout, Litu, Advait Parulekar, Constantine Caramanis, and Sanjay Shakkottai. “A Theoretical Justification for Image Inpainting Using Denoising Diffusion Probabilistic Models.” Preprint, 2023.