Hatef Monajemi is a Data Science scholar at the Stanford Data Science Initiative (SDSI) and a postdoctoral fellow in the Department of Statistics. He received his PhD from Stanford University in 2016 in the area of sparse sensing under the supervision of Prof. David Donoho. Hatef's research concerns massive computational experiments for scientific discovery, high-dimensional data analysis, and mathematical theories of machine learning algorithms. During his PhD, Hatef, in collaboration with his adviser David L. Donoho, developed and used ClusterJob (CJ) to conduct more than 40 million reproducible computational experiments in the cloud for his dissertation. He hopes that CJ can facilitate computational studies and data-driven discoveries in academia, and that other researchers will benefit from ClusterJob as much as he did. More information about Hatef can be found on his website.
David L. Donoho
David L. Donoho is a professor of statistics at Stanford University, where he is also the Anne T. and Robert M. Bass Professor in the Humanities and Sciences. His work includes the development of effective methods for constructing low-dimensional representations of high-dimensional data (multiscale geometric analysis), wavelet methods for denoising, and compressed sensing. He is a pioneer of reproducible scientific publication and of ambitious data science studies. For more information, visit https://en.wikipedia.org/wiki/David_Donoho.
Bekk Blando is a software engineer and a senior undergraduate student at Clemson University studying Mathematics and Computer Science. Bekk is interested in technologies that can facilitate ambitious data science experiments in the cloud. He is a core developer of ClusterJob. For more information about Bekk, visit his GitHub page.