
Communication-Efficient Distributed Stochastic Gradient Descent with Butterfly Mixing
Author: Huasha Zhao | Affiliation: Department of Electrical Engineering and Computer Sciences, University of California, Berkeley | Processed: 2013-11-16 | Source: EECS | Full text available [14 pages]
Keywords: stochastic gradient descent; butterfly mixing; machine learning; data mining; sequential algorithm
Abstract: Stochastic gradient descent is a widely used method to find locally-optimal models in machine learning and data mining. However, it is naturally a sequential algorithm, and parallelization involves severe compromises because the cost of synchronizing across a cluster is much larger than the time required to compute an optimal-sized gradient step. Here we explore butterfly mixing, where gradient steps are interleaved with the k stages of a butterfly network on 2^k nodes. UDP-based butterfly mix steps should be extremely fast and failure-tolerant, and convergence is almost as fast as a full mix (AllReduce) on every step.
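The mixing pattern the abstract describes can be sketched in a few lines. This is a minimal simulation, not the paper's implementation: helper names are invented, nodes are plain Python lists rather than cluster processes, and the interleaved gradient step is only indicated in a comment. At mixing stage s, node i averages its model with partner i XOR 2^s; after the k stages of the butterfly, every node holds the exact global average, which is why k butterfly steps match one full AllReduce.

```python
import numpy as np

def butterfly_stage(models, s):
    """One butterfly mixing stage: node i averages with partner i ^ (1 << s).

    In butterfly-mixed SGD a local gradient step would be taken
    between successive stages; here we only show the mixing itself.
    """
    mixed = [None] * len(models)
    for i in range(len(models)):
        j = i ^ (1 << s)  # butterfly partner at stage s
        mixed[i] = 0.5 * (models[i] + models[j])
    return mixed

k = 3                                     # 2^k = 8 simulated nodes
rng = np.random.default_rng(0)
models = [rng.normal(size=4) for _ in range(2 ** k)]
target = np.mean(models, axis=0)          # what a full AllReduce would give

for s in range(k):                        # k stages of the butterfly network
    models = butterfly_stage(models, s)

# After k stages every node holds the global average.
assert all(np.allclose(m, target) for m in models)
```

Each stage is pairwise and needs only one message per node, which is why the paper can drive it over UDP: a lost packet degrades one stage of mixing rather than stalling a cluster-wide barrier.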