Sergei Kozlukov

Hi. My name's Serge Kozlukov. I'm a mathematician pretending to be a data scientist in the joint Master's program of Skoltech and the Higher School of Economics. I graduated in applied mathematics from Voronezh State University, Russia, supervised by Anatoly Baskakov.

So far I have lived in a rather linear, normed, and complete world, more often finite-dimensional than not. I have had a touch of linear analysis, optimal control, and variational calculus. Now I'm seeking ways to comprehend at least the containing manifold in its wholeness. I'd also like to transition from our overconstrained universe to that of distributions.

Log

July 2019-present: Research, software engineering, devops at in3D.io

At in3D we're building 3D avatars of humans using just the iPhone X depth camera, with minimal user interaction: you basically place yourself in front of the camera, say "start", turn around, and in a couple of minutes you get a textured 3D mesh precise enough to take measurements for personalized clothing manufacturing.

I'm supposedly a researcher, but I'm having different sorts of fun here:

  • Actively refactoring
  • Collecting data
  • Wrangling Docker containers and microservices
  • Occasionally getting dirty with backend
  • But also reading up on illumination models and inverse graphics
  • Taking part in the reconstruction pipeline
  • Planning to work on texturing more closely

Also, here I met Dmitry Ulyanov, Vadim Lebedev, Nicholai Chinaev, Ilya Krotov, Bulat Yakupov, and Vsevolod Poletaev.

2018-2020: M. Sc. in CS/Data Science, HSE, Skoltech

Taking the Statistical Learning Theory network program of the Higher School of Economics and Skoltech.

  • Also having a tremendous amount of innovative-leadership and entrepreneurship lectures and games... A tiny bit more than I would prefer to have

  • (TODO:) Lempitsky's DL. Same team + Kirill Mazur + Maria Taktasheva. Hyperbolic convolutions, hyperbolic batchnorm -- for images.

  • (TODO:) Burnaev's ML. Not my favorite. Same team + Rasul Karimov. Hyperbolic neural networks.

  • (TODO:) At about the same time the second term started, and with it came Oseledets' NLA. Not my favorite either. Same team. The project was on reproducing the spectral normalization paper.

  • In November 2018 I started working with Thibaut Le Gouic on optimal transportation, curvature in metric spaces, and barycentres. The motivation for me was that it seemed like the proper language in which to speak probability and statistics, and one that could be connected to physics, which supposedly should be fun. The main reference for what was planned is https://arxiv.org/abs/1806.02740. I'm very thankful to Thibaut Le Gouic and Quentin Paris for their help in this direction, even though my priorities soon changed and we couldn't interact much. Recalling it now, I think it was this short collaboration that led to our idea of describing batch normalization on manifolds using barycentres of measures.

  • In October, I think, we had a one-month Large-Scale Optimization course by Yury Maximov, which apparently included material (and homework...) for at least three months. All courses at Skoltech result in a final project, and this one was no exception. This was when the core of the team for the rest of the MSc started forming: Maxim Kochurov came up with the idea that optimizing arbitrary stuff isn't enough fun, and that it'd be awesome to, e.g., reproduce the results of some paper from the last NeurIPS (NIPS back then). Those who felt the same way included Natasha Bobrovskaya, Maria Kolos, Pavel Kaloshin, Marina Pominova, Max Kochurov, and me. We picked Accelerating Natural Gradient with Higher-Order Invariance: it seemed fun and math-y. Quite soon we found out that even though I'm supposed to be a mathematician, I had never taken differential geometry (although I had tried reading Spivak and Li before and knew some basic notions, and had tried Henri Cartan's differential forms, which aren't about manifolds but are rather relatable and build up the framework). I instantly got curious about Amari's work and Information Geometry, and couldn't help but get back to Li, which I then spent several months on and still haven't finished.
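For context on the natural-gradient project above: the update preconditions the ordinary gradient with the Fisher information matrix, which makes it invariant to reparametrization to first order. This is the standard form from Amari's Information Geometry; the paper's higher-order corrections go beyond this sketch:

```latex
% Natural gradient descent step, preconditioned by the Fisher information F(\theta)
\theta_{t+1} = \theta_t - \eta \, F(\theta_t)^{-1} \nabla_\theta L(\theta_t),
\qquad
F(\theta) = \mathbb{E}_{x \sim p_\theta}\!\left[
  \nabla_\theta \log p_\theta(x) \, \nabla_\theta \log p_\theta(x)^{\top}
\right]
```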
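For reference, the barycentre behind that batch-normalization-on-manifolds idea is the usual Fréchet-mean-style object; in the Wasserstein setting it reads (standard definition, nothing specific to our formulation):

```latex
% Wasserstein barycentre of measures \mu_1,\dots,\mu_n with weights \lambda_i \ge 0, \sum_i \lambda_i = 1
\bar\mu \in \operatorname*{arg\,min}_{\nu \in \mathcal{P}_2(X)}
  \sum_{i=1}^{n} \lambda_i \, W_2^2(\mu_i, \nu)
```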
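The spectral normalization paper we reproduced for NLA has a very small core: estimate the largest singular value of a weight matrix by power iteration and divide the weight by it. A minimal NumPy sketch of that idea (my illustration here, not the project's actual code):

```python
import numpy as np

def spectral_normalize(W, u=None, n_iter=1):
    """Estimate the largest singular value sigma of W by power
    iteration, then return (W / sigma, u) so that the left singular
    vector estimate u can be reused across calls."""
    if u is None:
        u = np.random.randn(W.shape[0])
    for _ in range(n_iter):
        v = W.T @ u
        v /= np.linalg.norm(v)
        u = W @ v
        u /= np.linalg.norm(u)
    sigma = u @ W @ v  # Rayleigh-quotient estimate of the top singular value
    return W / sigma, u
```

In the actual method one power-iteration step per training step suffices, since the weights change slowly; with many steps the spectral norm of the result converges to 1.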

Spring-Summer 2018: R3DS

(TODO:) Just a couple of months with really cool guys, C++, Eigen, and B-splines.

2014-2018: B. Sc. in applied math, Voronezh SU

One year at the mechanics department, then one year at applied mathematics & informatics, and finally two years at Nonlinear Dynamics (nowadays dubbed "Optimal Control and Systems Management" or something) under the supervision of Anatoly Baskakov.

  • Studying the method of similar operators

    Aside from what I'd call mere noise, I finished two papers (one in Russian) on applying the method of similar operators to estimating spectral properties of perturbed matrices of special structure.

    I presented them at some local conferences.

    I don't have a high opinion of these articles, neither of their value nor of their style, but they were a start. I'm considering, though, trying to turn the method into an iterative numerical scheme; I just don't feel its convergence rates could compete with existing methods. Might try one day.

    The topic can also be developed further:

    • by applying the split theorem to refine the estimates
    • by generalizing to Banach spaces

    Both seem trivial and not exactly profitable.
