Yoram Singer - BOOM: BOOsting with Momentum (Technion Computer Engineering Lecture)

submitted by aitbroadcast on 08/04/14

Yoram Singer of Google presents "BOOM: BOOsting with Momentum," a lecture from the Technion Computer Engineering Center 2013 conference. In the talk we review one of the largest machine learning platforms at Google, called Sibyl. Sibyl can handle over 100B examples in 100B dimensions, so long as each example is very sparse. The recent version of Sibyl fuses Nesterov's accelerated gradient method with parallel boosting. The result is an algorithm that retains the momentum and convergence properties of the accelerated gradient method while taking into account the curvature of the objective function. The algorithm, termed BOOM, is fast to converge, supports any smooth convex loss function, and is easy to parallelize. We conclude with a few examples of problems at Google that the system handles.
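For context, the sketch below is a minimal NumPy implementation of Nesterov's accelerated gradient method, the momentum component that BOOM builds on, applied to an illustrative least-squares objective. It is not the Sibyl/BOOM implementation; the function name, problem setup, and fixed step size are assumptions chosen for the example.

```python
import numpy as np

def nesterov_accelerated_gradient(grad, x0, step_size, num_iters=200):
    """Plain Nesterov accelerated gradient descent (illustrative sketch).

    grad      : callable returning the gradient at a point
    x0        : initial iterate
    step_size : fixed learning rate (1/L for an L-smooth objective)
    """
    x_prev = x0.copy()
    x = x0.copy()
    for t in range(1, num_iters + 1):
        # Momentum (look-ahead) point: extrapolate along the previous step.
        beta = (t - 1) / (t + 2)
        y = x + beta * (x - x_prev)
        # Gradient step taken from the look-ahead point.
        x_prev, x = x, y - step_size * grad(y)
    return x

# Illustrative least-squares problem: minimize 0.5 * ||A w - b||^2.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 20))
b = rng.standard_normal(100)
grad = lambda w: A.T @ (A @ w - b)
L = np.linalg.norm(A, 2) ** 2          # smoothness constant of this objective
w = nesterov_accelerated_gradient(grad, np.zeros(20), step_size=1.0 / L)
print("final loss:", 0.5 * np.linalg.norm(A @ w - b) ** 2)
```

BOOM, as described in the abstract, additionally incorporates curvature information and boosting-style parallel coordinate updates on top of this momentum scheme.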
