An Introduction to Neural Networks and Their Use in the Wolfram Language
Dr. Mike Yeh (Wolfram Technical Engineer)
We introduce the basic principles of neural networks in deep learning, including the basic architecture, activation functions, the ideas behind optimization algorithms such as gradient descent, batch normalization, pooling layers, and other important techniques and concepts. We also show how to use and apply the corresponding components in the Wolfram Language.
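As a rough illustration (not taken from the talk itself), the components the abstract mentions all correspond to built-in Wolfram Language layers that can be assembled with NetChain; the data set `trainingData` below is a hypothetical placeholder.

```wolfram
(* A minimal sketch: convolution, batch normalization, a ReLU activation,
   and pooling, assembled into a small classifier with NetChain. *)
net = NetChain[{
    ConvolutionLayer[16, {3, 3}],  (* learnable convolution filters *)
    BatchNormalizationLayer[],     (* batch normalization *)
    ElementwiseLayer[Ramp],        (* ReLU activation function *)
    PoolingLayer[{2, 2}],          (* max pooling *)
    FlattenLayer[],
    LinearLayer[10],               (* fully connected output layer *)
    SoftmaxLayer[]                 (* class probabilities *)
   },
   "Input" -> {1, 28, 28}          (* e.g. a single-channel 28x28 image *)
  ]

(* Training minimizes a loss with a gradient-descent variant such as Adam;
   `trainingData` is assumed to be a list of input -> class rules. *)
trained = NetTrain[net, trainingData, Method -> "ADAM"]
```

NetTrain handles the gradient computation and optimizer updates automatically; the Method option selects the gradient-descent variant.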
Channels: Technology Conference
Other videos in this channel:

- Eric Mjolsness: Collaborative projects have resulted in several Mathematica-implemented modeling languages aimed at general-purpose biological modeling, which is a useful and topical but indefinitely expandable goal. We update previous work on ...
- Jae Bum Jung / Yan Zhuang
- Phillip Todd
- Василий Сороко
- Phil Ramsden
- Lou D'Andria: Constructing interfaces with Dynamic, DynamicModule and Manipulate is nothing new, but those aren't the only Dynamic primitives available in Mathematica. In this talk, we'll identify and demonstrate some of the ...
- Галина Михалкина, Григорий Фридман
- Галина Михалкина
- Андрей Кротких
- Антон Екименко, Кирилл Белов
- Lebedev Physical Institute (Физический институт имени П.Н. Лебедева)
- Григорий Фридман, Олег Иванов
- Галина Михалкина
- Олег Кофнов
- Николай Сосновский
- Микаэл Эгибян
- Микаэл Эгибян
- Леонид Шифрин
- Вахагн Геворгян
- Алексей Семенов