Song Han is a rising fifth-year PhD student at Stanford University, advised by Bill Dally. Song’s research interests are deep learning and computer architecture; he is currently focused on improving the accuracy and efficiency of neural networks on mobile and embedded systems. Song has worked on deep compression, which can compress state-of-the-art CNNs by 10x–49x and shrink SqueezeNet to only 470KB, small enough to fit fully in on-chip SRAM. He proposed the DSD training flow, which improved the accuracy of a wide range of neural networks, and designed the EIE accelerator, an ASIC that operates directly on the compressed model and is 13x faster and 3,000x more energy-efficient than a TitanX GPU. Song’s work has been covered by The Next Platform, TechEmergence, Embedded Vision, and O’Reilly. His work on deep compression won the Best Paper Award at ICLR ’16.
©2016, O'Reilly Media, Inc.