Presented By O'Reilly and Cloudera
Make Data Work
Feb 17–20, 2015 • San Jose, CA

Anima Anandkumar
Faculty member, UC Irvine


Anima Anandkumar has been a faculty member in the EECS Department at UC Irvine
since August 2010. Her research interests are in large-scale machine learning
and high-dimensional statistics. She received her B.Tech in Electrical
Engineering from IIT Madras in 2004 and her PhD from Cornell University in
2009. She was a visiting faculty member at Microsoft Research New England in
2012 and a postdoctoral researcher at MIT from 2009 to 2010. She is the
recipient of the Alfred P. Sloan Fellowship, the Microsoft Faculty Fellowship,
the ARO Young Investigator Award, the NSF CAREER Award, the IBM Fran Allen PhD
Fellowship, the thesis award from the ACM SIGMETRICS society, and paper awards
from the ACM SIGMETRICS and IEEE Signal Processing societies.


9:00am–5:00pm Wednesday, 02/18/2015
Hardcore Data Science
Location: LL20 BC
Ben Lorica (O'Reilly), Ben Recht (University of California, Berkeley), Chris Re (Stanford University | Apple), Maya Gupta (Google), Alyosha Efros (UC Berkeley), Eamonn Keogh (University of California - Riverside), John Myles White (Facebook), Fei-Fei Li (Stanford University), Tara Sainath (Google), Michael Jordan (UC Berkeley), Anima Anandkumar (UC Irvine), John Canny (UC Berkeley), David Andrzejewski (Sumo Logic)
Average rating: 4.86 (7 ratings)
All-Day: Strata's regular data science track has great talks with real-world experience from leading-edge speakers. But we didn't stop there: we added the Hardcore Data Science day to give you a chance to go even deeper. The Hardcore day will add new techniques and technologies to your data science toolbox, shared by leading data science practitioners from startups, industry, consulting...
1:30pm–2:00pm Wednesday, 02/18/2015
Hardcore Data Science
Location: LL20 BC
Anima Anandkumar (UC Irvine)
Average rating: 4.00 (6 ratings)
I will demonstrate how to exploit tensor methods for learning. Tensors are higher-order generalizations of matrices and are useful for representing rich information structures. Tensor factorization involves finding a compact representation of the tensor using simple linear and multilinear algebra.
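To make the compactness claim concrete, here is a minimal NumPy sketch (not from the talk; the dimensions, rank, and variable names are illustrative assumptions) of a low-rank CP-style representation: a third-order tensor built as a sum of outer products of factor-matrix columns, which stores far fewer parameters than the full tensor.

```python
import numpy as np

# Illustrative sketch: a rank-2 CP-style representation of a 3rd-order tensor.
# The sizes (d=5, rank=2) and random factors are assumptions for demonstration.
rng = np.random.default_rng(0)
d, rank = 5, 2

# Factor matrices: each column defines one rank-1 component.
A = rng.standard_normal((d, rank))
B = rng.standard_normal((d, rank))
C = rng.standard_normal((d, rank))

# The tensor is a sum of outer products of the factor columns:
#   T[i, j, k] = sum_r A[i, r] * B[j, r] * C[k, r]
T = np.einsum('ir,jr,kr->ijk', A, B, C)

print(T.shape)             # (5, 5, 5)
# Compact: the factors hold 3*d*rank numbers vs. d**3 for the full tensor.
print(3 * d * rank, d**3)  # 30 125
```

The `einsum` expression is just the multilinear sum written index-by-index; recovering the factor matrices from an observed `T` is the tensor factorization problem the session describes.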