We are a computer science research group led by Ce Zhang with help from many close collaborators and friends.

Members | Publications

One-pager Research Statement

Master Thesis Projects: Spring 2018

We are offering a series of master thesis projects in the spring semester of 2018. Drop us an email at ce.zhang@inf.ethz.ch if you are interested. Some sample topics:

Applications: data-driven astrophysics, meteorology, social sciences, proteomics, reinforcement learning for telescope array control, self-driving, etc.

Systems: scalable deep learning over a thousand GPUs, model compression for deep learning, in-database machine learning, blockchain and cryptocurrency as a general computation platform, system control with predictive models, etc.

Machine Learning: foundations of system relaxations (decentralized learning, low-precision communication, asynchrony); multi-task learning, etc.

News

SemEval 2018

  • Nora Hollenstein & Jonathan Rotsztejn’s system ranked first in the relation classification subtask among 28 international teams at SemEval 2018 (Task 7 Subtask 1)! Their system also placed first and second in two other relation extraction subtasks (Task 7 Subtask 2).

VLDB 2018

  • Ease.ml: Towards Multi-tenant Resource Sharing for Machine Learning Workloads.

AISTATS 2018

  • Prof. Heng Guo & Kaan Kara: Layerwise Systematic Scan: Deep Boltzmann Machines and Beyond.

EDBT 2018

  • Demjan Grubic: Synchronous Multi-GPU Training for Deep Learning with Low-Precision Communications: An Empirical Study.

NIPS 2017

  • Oral Presentation: Xiangru Lian and Prof. Ji Liu on decentralized learning [paper] (a minimal sketch of the idea follows below).
  • A 3-minute teaser video: 
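
For a flavor of the decentralized learning setup, here is a minimal numpy sketch of decentralized SGD with gossip averaging on a ring. The worker count, data, and learning rate are illustrative assumptions; this is not the implementation from the NIPS 2017 paper.

    # Minimal sketch of decentralized SGD with gossip averaging on a ring
    # (illustrative only, not the NIPS 2017 implementation). Each worker keeps
    # its own model copy and averages only with its two ring neighbors.
    import numpy as np

    rng = np.random.default_rng(0)
    n_workers, dim, lr, steps = 8, 10, 0.1, 300

    # Hypothetical per-worker data: every worker sees its own samples of the
    # same linear regression problem.
    true_w = rng.normal(size=dim)
    Xs = [rng.normal(size=(32, dim)) for _ in range(n_workers)]
    ys = [X @ true_w + 0.01 * rng.normal(size=32) for X in Xs]

    models = [np.zeros(dim) for _ in range(n_workers)]
    for _ in range(steps):
        grads = [X.T @ (X @ w - y) / len(y) for w, X, y in zip(models, Xs, ys)]
        # Gossip step: average with left/right neighbors -- no parameter server.
        mixed = [(models[i - 1] + models[i] + models[(i + 1) % n_workers]) / 3
                 for i in range(n_workers)]
        models = [w - lr * g for w, g in zip(mixed, grads)]

    print(np.linalg.norm(np.mean(models, axis=0) - true_w))  # small: workers reach consensus

The point of the gossip step is that every worker only exchanges models with its neighbors, yet all copies drift toward a common solution.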

Science

  • space.ml is featured in a news article in Science magazine [link].
  • GalaxyGAN is selected as an Editor’s Choice in Science magazine [link].

VLDB 2017 (Munich, Aug 28 – Sep 1)

  • Come by our two talks: Lele Yu on building Bayesian inference as a new service over hundreds of machines; Zhipeng Zhang on a comparative study of different SimRank algorithms [paper].
  • Also don’t miss the demo session: Xupeng Li on ease.ml version 1, declarative in-database machine learning with a cute homomorphism between relational algebra and linear algebra [paper] (a toy SQL sketch follows below).
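
For a flavor of that relational-algebra/linear-algebra correspondence, here is a toy sketch (not ease.ml's actual interface): a sparse matrix is stored as a relation (i, k, v), and matrix multiplication becomes a join plus aggregation.

    # Toy illustration of the relational-algebra / linear-algebra correspondence
    # (not ease.ml's actual interface): store a sparse matrix as a relation
    # (i, k, v) and express matrix multiplication as a join plus aggregation.
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE A (i INT, k INT, v REAL)")
    con.execute("CREATE TABLE B (k INT, j INT, v REAL)")
    # A = [[1, 2], [0, 3]] and B = [[4, 0], [5, 6]] in sparse (index, value) form.
    con.executemany("INSERT INTO A VALUES (?, ?, ?)", [(0, 0, 1), (0, 1, 2), (1, 1, 3)])
    con.executemany("INSERT INTO B VALUES (?, ?, ?)", [(0, 0, 4), (1, 0, 5), (1, 1, 6)])

    # (A @ B)[i, j] = sum_k A[i, k] * B[k, j]: the shared index k becomes a join key.
    rows = con.execute("""
        SELECT A.i, B.j, SUM(A.v * B.v)
        FROM A JOIN B ON A.k = B.k
        GROUP BY A.i, B.j
    """).fetchall()
    print(sorted(rows))  # [(0, 0, 14.0), (0, 1, 12.0), (1, 0, 15.0), (1, 1, 18.0)]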

ICML 2017 (Sydney, Aug 6 – Aug 11)

  • Hantian Zhang is going to give a talk about ZipML, low-precision machine learning on modern hardware [paper] (a minimal quantization sketch follows below).
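
For intuition, here is a minimal sketch of unbiased stochastic quantization, a basic building block behind low-precision training schemes such as ZipML. The 4-level grid and random data below are illustrative assumptions, not the paper's setup.

    # Minimal sketch of unbiased stochastic quantization, a basic building block
    # of low-precision training (the 4-level grid below is illustrative).
    import numpy as np

    def stochastic_quantize(x, n_levels=4):
        """Snap each entry of x to one of `n_levels` evenly spaced values between
        x.min() and x.max(), rounding up or down at random so that E[q(x)] = x."""
        lo, hi = x.min(), x.max()
        scale = (hi - lo) / (n_levels - 1)
        t = (x - lo) / scale                    # position on the quantization grid
        frac = t - np.floor(t)                  # round up with this probability
        q = np.floor(t) + (np.random.rand(*x.shape) < frac)
        return lo + q * scale

    x = np.random.randn(100_000)
    avg = np.mean([stochastic_quantize(x) for _ in range(200)], axis=0)
    print(np.abs(avg - x).mean())               # small: the quantizer is unbiased

Because the rounding is unbiased, gradients can be communicated with only a few bits per entry while SGD still converges in expectation.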

SIGMOD 2017 (Chicago, May 14 – May 19)

  • Jiawei Jiang gave a talk about a distributed machine learning system designed for heterogeneous infrastructure where stragglers are expected [paper].
  • HILDA: Come by to hear our vision for ease.ml, deep learning in four lines of code to serve ETH scientists [paper].

Machine Learning on Modern Hardware

  • Kaan Kara: training linear models on FPGAs with low precision [FCCM paper].
  • Ewaida Mohsen: XGBoost inference on FPGAs that can process up to 20M tuples per second [FPL paper]!

space.ml (with Kevin Schawinski) gets covered by Science (Editor’s Choice), The Atlantic, and WIRED Science.

An ETH Globe article about DS3Lab.

Ce gives talks at ETH Meets New York and his Inaugural Lecture at ETH.

Hantian and Dan give the ZipML session at NVIDIA GTC 2017.