


What does the "high-end version" of Putian shoes mean? What is this XGBoost everyone keeps talking about?

Published: 2021-03-23 20:19:46  Source: compiled from the web


I've recently been filling in some background knowledge on machine learning. There isn't much systematic material out there, yet the field involves a great many concepts, and you constantly run into acronyms. Many people have only a hazy grasp of the concept hierarchy themselves, so they find it hard to explain things clearly. Here I'll record how I go about understanding a new concept.

When taking on anything, setting a goal up front matters. My goal is not to implement the algorithms, so I don't need to work through each algorithm's mathematical details and implementation steps; instead I want to build a conceptual understanding and gradually fill out my knowledge of the algorithm landscape. That means not getting bogged down in details at first. Once the overall framework is in place, I can go deeper level by level; see the "Pyramid Principle" for this approach.

My method is to search the relevant keywords on Google or Bing, follow the leads, and read the Wikipedia entries first:

  • XGBoost

From the project description, it aims to provide a "Scalable, Portable and Distributed Gradient Boosting (GBM, GBRT, GBDT) Library". Other than running on a single machine, it also supports the distributed processing frameworks Apache Hadoop, Apache Spark, and Apache Flink.

  • Gradient Boosting

Gradient boosting is a machine learning technique for regression and classification problems, which produces a prediction model in the form of an ensemble of weak prediction models, typically decision trees. It builds the model in a stage-wise fashion like other boosting methods do, and it generalizes them by allowing optimization of an arbitrary differentiable loss function.

Gradient Boosting is also a boosting algorithm (duh!), hence it also tries to create a strong learner from an ensemble of weak learners. This algorithm is similar to Adaptive Boosting (AdaBoost) but differs from it in certain respects. In this method we treat the boosting problem as an optimisation problem, i.e. we take a loss function and try to optimise it. This idea was first developed by Leo Breiman.
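The stage-wise idea described above can be sketched in plain Python: a toy gradient-boosting regressor whose weak learners are one-split "stumps", each fitted to the current residuals (for squared loss, the residual is exactly the negative gradient). This is an illustrative sketch only, not how XGBoost or any real library is implemented; production libraries add regularization, deeper trees, subsampling, and much more.

```python
# Toy gradient boosting for regression with squared loss.
# Weak learner: a decision "stump" predicting one constant on each
# side of a single split threshold. Illustrative sketch only.

def fit_stump(x, residuals):
    """Find the split on x that best fits the residuals (squared error)."""
    best = None
    for t in sorted(set(x)):
        left = [r for xi, r in zip(x, residuals) if xi <= t]
        right = [r for xi, r in zip(x, residuals) if xi > t]
        if not left or not right:
            continue
        lmean, rmean = sum(left) / len(left), sum(right) / len(right)
        err = (sum((r - lmean) ** 2 for r in left)
               + sum((r - rmean) ** 2 for r in right))
        if best is None or err < best[0]:
            best = (err, t, lmean, rmean)
    _, t, lmean, rmean = best
    return lambda xi, t=t, l=lmean, r=rmean: l if xi <= t else r

def gradient_boost(x, y, n_rounds=50, lr=0.1):
    """Stage-wise fit: each stump is trained on the current residuals
    y - prediction, i.e. the negative gradient of squared loss."""
    base = sum(y) / len(y)          # initial model: the mean of y
    stumps, pred = [], [base] * len(y)
    for _ in range(n_rounds):
        residuals = [yi - pi for yi, pi in zip(y, pred)]
        stump = fit_stump(x, residuals)
        stumps.append(stump)
        pred = [pi + lr * stump(xi) for xi, pi in zip(x, pred)]
    return lambda xi: base + lr * sum(s(xi) for s in stumps)

x = [1, 2, 3, 4, 5, 6, 7, 8]
y = [1.0, 1.1, 0.9, 1.0, 3.0, 3.1, 2.9, 3.0]  # a noisy step function
model = gradient_boost(x, y)
```

Each round shrinks the residuals a little (the learning rate `lr` controls by how much), so the ensemble of weak stumps gradually approximates the step in the data.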

  • Boosting

"Can a set of weak learners create a single strong learner?" A weak learner is defined to be a classifier that is only slightly correlated with the true classification (it can label examples better than random guessing). In contrast, a strong learner is a classifier that is arbitrarily well-correlated with the true classification.

When first introduced, the hypothesis boosting problem simply referred to the process of turning a weak learner into a strong learner. "Informally, [the hypothesis boosting] problem asks whether an efficient learning algorithm […] that outputs a hypothesis whose performance is only slightly better than random guessing [i.e. a weak learner] implies the existence of an efficient algorithm that outputs a hypothesis of arbitrary accuracy [i.e. a strong learner]."[3] Algorithms that achieve hypothesis boosting quickly became simply known as "boosting". Freund and Schapire's arcing (Adapt[at]ive Resampling and Combining),[7] as a general technique, is more or less synonymous with boosting.[8]
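The "weak learners into a strong learner" question has a simple numerical intuition: if many independent classifiers are each right only 60% of the time, a majority vote among them is right far more often. The simulation below illustrates only that voting intuition, under the (strong) assumption of independent errors; real boosting must also construct the learners and keep them from all making the same mistakes.

```python
# Majority vote over many weak, statistically independent classifiers.
# Each weak learner alone is right only 60% of the time; voting drives
# the ensemble's accuracy toward 1. Illustration of the intuition only.
import random

random.seed(0)

def ensemble_accuracy(n_learners, p_correct=0.6, n_trials=2000):
    wins = 0
    for _ in range(n_trials):
        # +1 vote if a learner is correct, -1 if it is wrong
        votes = sum(1 if random.random() < p_correct else -1
                    for _ in range(n_learners))
        if votes > 0:           # majority voted for the true label
            wins += 1
    return wins / n_trials

acc1 = ensemble_accuracy(1)      # a single weak learner: ~0.6
acc101 = ensemble_accuracy(101)  # 101 independent weak learners
```

With 101 independent 60%-accurate voters the majority is almost always right; the hard part, which boosting addresses, is that learners trained on the same data are not independent.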

The concept I directly care about is XGBoost, which leads to Gradient Boosting, which in turn leads to Boosting. Following this thread, the relationships between the concepts become obvious and easy to understand.

An idea -> an algorithm -> a framework

  • Naming the idea

Gradually improve from an ensemble of weak learners toward a single strong learner, boosting prediction accuracy along the way. Since the essence of the process is gradual improvement, the name "boosting" is easy to understand.

  • Algorithms

Many boosting algorithms implement this idea, for example AdaBoost and gradient boosting (GradientBoost).
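To make the contrast between the two algorithm families concrete, here is a toy AdaBoost on 1-D data with threshold stumps. Unlike gradient boosting, which fits each new learner to the residuals of a loss gradient, AdaBoost re-weights the training points after each round so the next stump concentrates on the examples the ensemble still gets wrong. This is a hypothetical sketch for illustration, not a production implementation.

```python
# Toy AdaBoost on 1-D data. Weak learner: sign * (1 if x > t else -1).
# After each round, misclassified points are up-weighted so the next
# stump focuses on them. Illustrative sketch only.
import math

def adaboost(x, y, n_rounds=3):
    """y labels are in {-1, +1}. Returns an ensemble predictor."""
    n = len(x)
    w = [1.0 / n] * n                  # uniform sample weights
    learners = []                      # (alpha, threshold, sign)
    for _ in range(n_rounds):
        # pick the stump minimizing the weighted training error
        best = None
        for t in x:
            for sign in (1, -1):
                err = sum(wi for xi, yi, wi in zip(x, y, w)
                          if sign * (1 if xi > t else -1) != yi)
                if best is None or err < best[0]:
                    best = (err, t, sign)
        err, t, sign = best
        err = max(err, 1e-10)          # avoid log(0) on a perfect stump
        alpha = 0.5 * math.log((1 - err) / err)   # learner's vote weight
        learners.append((alpha, t, sign))
        # up-weight the points this stump got wrong, then renormalize
        w = [wi * math.exp(-alpha * yi * sign * (1 if xi > t else -1))
             for xi, yi, wi in zip(x, y, w)]
        total = sum(w)
        w = [wi / total for wi in w]
    def predict(xi):
        score = sum(a * sg * (1 if xi > t else -1) for a, t, sg in learners)
        return 1 if score > 0 else -1
    return predict

x = [1, 2, 3, 4, 5, 6]
y = [-1, -1, 1, 1, -1, -1]   # an "interval": no single stump can fit it
clf = adaboost(x, y, n_rounds=3)
```

No single threshold separates the positive interval in the middle, but a weighted vote of three stumps does, which is exactly the weak-to-strong progression the naming describes.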

  • Frameworks

Turning an algorithm implementation into something practical and general-purpose requires framework support. A framework can contribute on several fronts:

Below it: abstracting away low-level dependencies and supporting multiple platforms transparently to the application, e.g. single-machine and distributed environments

Within it: improving the algorithm's results, raising performance, reducing resource consumption, etc.

Around it: developer-friendly multi-language bindings, development tools, debugging tools, deployment tools, etc.

Well-known frameworks include XGBoost and LightGBM.



