News & Announcements



Announcement for Talk No. 261 of the "统计大讲堂" (Statistics Lecture Series): Mini-batch Gradient Descent with Buffer

2024-10-14

Time: 10:00–11:00 a.m., Friday, October 18, 2024

Venue: Room 1016, Mingde Main Building

Speaker: 亓颢博 (Haobo Qi)

Topic: Mini-batch Gradient Descent with Buffer

Abstract

Mini-batch Gradient Descent with Buffer

In this paper, we study a buffered mini-batch gradient descent (BMGD) algorithm for training complex models on massive datasets. The algorithm is designed for fast training on a GPU-CPU system and consists of two steps: a buffering step and a computation step. In the buffering step, a large batch of data (i.e., a buffer) is loaded from the hard drive into GPU memory. In the computation step, a standard mini-batch gradient descent (MGD) algorithm is applied to the buffered data. Compared to the traditional MGD algorithm, the proposed BMGD algorithm can be more efficient for two reasons. First, BMGD uses the buffered data for multiple rounds of gradient updates, which reduces the expensive communication cost from the hard drive to GPU memory. Second, the buffering step can be executed in parallel, so the GPU does not have to stay idle while new data are loaded. We first investigate the theoretical properties of the BMGD algorithm in a linear regression setting. The analysis is then extended to the Polyak-Łojasiewicz loss function class. The theoretical claims about the BMGD algorithm are verified numerically through simulation studies, and the practical usefulness of the proposed method is demonstrated by three image-related real data analyses.
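The two-step structure described above (load a buffer, then reuse it for several rounds of mini-batch updates) can be sketched in plain NumPy. This is a minimal, hypothetical illustration for the linear-regression setting mentioned in the abstract, not the authors' implementation: the function name and all hyperparameters are assumptions, and the hard-drive-to-GPU transfer is simulated by indexing into an in-memory array.

```python
import numpy as np

def bmgd(X, y, buffer_size=256, batch_size=32, epochs_per_buffer=4,
         lr=0.01, n_passes=2, seed=None):
    """Sketch of buffered mini-batch gradient descent (BMGD) for linear
    regression with squared-error loss. Each buffer stands in for one
    hard-drive -> GPU transfer; the buffered data are then reused for
    several rounds of standard mini-batch gradient descent (MGD)."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    w = np.zeros(p)
    for _ in range(n_passes):                       # passes over the full dataset
        order = rng.permutation(n)
        for start in range(0, n, buffer_size):
            # Buffering step: "load" one buffer of data at once.
            buf = order[start:start + buffer_size]
            Xb, yb = X[buf], y[buf]
            # Computation step: multiple MGD epochs on the same buffer,
            # amortizing the (expensive) loading cost over many updates.
            for _ in range(epochs_per_buffer):
                for i in range(0, len(buf), batch_size):
                    Xm, ym = Xb[i:i + batch_size], yb[i:i + batch_size]
                    grad = 2.0 * Xm.T @ (Xm @ w - ym) / len(ym)
                    w -= lr * grad
    return w
```

In a real GPU pipeline the buffering step would run on a separate thread so that the next buffer is fetched while the GPU processes the current one; the sketch above runs the two steps sequentially for clarity.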

About the Speaker


亓颢博 (Haobo Qi) received his Ph.D. in Economics from the Guanghua School of Management, Peking University, in 2023. He is currently a postdoctoral fellow in the Department of Mathematical Statistics, School of Statistics, Beijing Normal University. His research interests include statistical optimization, statistical analysis of large-scale data, network data analysis, and statistical theory for deep learning. He has published papers in journals including the Journal of Computational and Graphical Statistics, Neurocomputing, and Computational Statistics & Data Analysis.
