

You Should Be Aware of These (Common) Deep Learning Terms and Terminologies


深度學(xué)習(xí)術(shù)語

術(shù)語 (Terminologies)

Introduction

I’ve recently gone through a set of machine learning-based projects presented in Jupyter notebooks and have noticed that a set of terms and terminologies recurs across all the notebooks and machine learning-based projects I’ve worked on or reviewed.

You can see this article as a way of cutting through some noise within machine learning and deep learning. Expect to find descriptions and explanations of terms and terminologies that you are bound to come across in the majority of deep learning-based projects.

I cover the definitions of terms and terminologies associated with the following subject areas of a machine learning project:

  • Datasets

  • Convolutional Neural Network Architecture

  • Techniques

  • Hyperparameters

  • 1.數(shù)據(jù)集 (1. Datasets)

Photo by Franki Chamaki on Unsplash

Training Dataset: The partition of our dataset used to train the neural network directly; training data refers to the portion of the dataset the network is exposed to during training.

Validation Dataset: This partition of the dataset is utilized during training to assess the performance of the network at various iterations.

Test Dataset: This partition of the dataset evaluates the performance of our network after the completion of the training phase.

    2.卷積神經(jīng)網(wǎng)絡(luò) (2. Convolutional Neural Networks)

Photo by Alina Grubnyak on Unsplash

Convolutional layer: A convolution is a mathematical operation that takes the dot product between two sets of elements. Within deep learning, the convolution operation acts on the filters/kernels and the image data array within the convolutional layer. A convolutional layer therefore simply houses the convolution operation that occurs between the filters and the images passed through a convolutional neural network.
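A minimal PyTorch sketch of a convolutional layer; the channel counts and kernel size below are arbitrary illustrative choices:

```python
import torch
import torch.nn as nn

# A convolutional layer: 16 filters of size 3x3 applied to 3-channel (RGB) input.
conv = nn.Conv2d(in_channels=3, out_channels=16, kernel_size=3, padding=1)

images = torch.randn(8, 3, 32, 32)  # a batch of 8 RGB 32x32 images
feature_maps = conv(images)
print(feature_maps.shape)  # torch.Size([8, 16, 32, 32])
```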

Batch Normalization layer: Batch normalization is a technique that mitigates the effect of unstable gradients within a neural network by introducing an additional layer that performs operations on the inputs from the previous layer. The operations standardize and normalize the input values, after which the values are transformed through scaling and shifting operations.
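A brief sketch, assuming the 16-channel feature maps from the previous example:

```python
import torch
import torch.nn as nn

# Normalizes each of 16 channels across the batch, then applies learnable
# scale (gamma) and shift (beta) parameters.
bn = nn.BatchNorm2d(num_features=16)

feature_maps = torch.randn(8, 16, 32, 32)
out = bn(feature_maps)
print(round(out.mean().item(), 3), round(out.std().item(), 3))  # ~0.0, ~1.0
```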

MaxPooling layer: Max pooling is a variant of sub-sampling where the maximum pixel value of the pixels that fall within the receptive field of a unit within a sub-sampling layer is taken as the output. A typical max-pooling operation has a 2x2 window and slides across the input data, outputting the maximum of the pixels within the receptive field of the kernel.
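A small sketch of that 2x2 max-pooling window, showing that the maximum (not the average) is emitted:

```python
import torch
import torch.nn as nn

pool = nn.MaxPool2d(kernel_size=2, stride=2)  # 2x2 window

x = torch.tensor([[[[1., 2.],
                    [3., 4.]]]])  # shape (1, 1, 2, 2)
print(pool(x))  # tensor([[[[4.]]]]) -- the maximum in the window
```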

Flatten layer: Takes an input shape and flattens the input image data into a one-dimensional array (demonstrated together with the dense layer in the sketch below).

Dense Layer: A dense layer contains an arbitrary number of embedded units/neurons, each of which acts as a perceptron.
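A minimal sketch combining the flatten and dense layers; the 16x8x8 feature-map shape and the 10 output units are illustrative assumptions:

```python
import torch
import torch.nn as nn

# Flatten a 16x8x8 feature map into a 1,024-element vector, then pass it
# through a dense layer with 10 units (e.g., one per class).
head = nn.Sequential(
    nn.Flatten(),               # (batch, 16, 8, 8) -> (batch, 1024)
    nn.Linear(16 * 8 * 8, 10),  # a dense layer of 10 neurons
)

feature_maps = torch.randn(8, 16, 8, 8)
print(head(feature_maps).shape)  # torch.Size([8, 10])
```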

3. Techniques

Photo by Markus Spiske on Unsplash

Activation Function: A mathematical operation that transforms the result or signals of neurons into a normalized output. The purpose of an activation function as a component of a neural network is to introduce non-linearity within the network. Including an activation function gives the neural network greater representational power and lets it model complex functions.

Rectified Linear Unit Activation Function (ReLU): A type of activation function that transforms the value results of a neuron. The transformation imposed by ReLU on values from a neuron is represented by the formula y=max(0,x). The ReLU activation function clamps any negative values from the neuron to 0, while positive values remain unchanged. The result of this mathematical transformation is utilized as the output of the current layer and used as input to a consecutive layer within a neural network.
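A small sketch of ReLU in PyTorch, using arbitrary example values:

```python
import torch
import torch.nn as nn

relu = nn.ReLU()
x = torch.tensor([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))  # tensor([0.0000, 0.0000, 0.0000, 1.5000, 3.0000])
# Element-wise y = max(0, x): negatives clamp to 0, positives pass through.
```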

Softmax Activation Function: A type of activation function that is utilized to derive the probability distribution of a set of numbers within an input vector. The output of a softmax activation function is a vector in which its set of values represents the probability of an occurrence of a class or event. The values within the vector all add up to 1.
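A brief sketch with made-up logits, showing the output vector summing to 1:

```python
import torch

logits = torch.tensor([2.0, 1.0, 0.1])
probs = torch.softmax(logits, dim=0)
print(probs)        # tensor([0.6590, 0.2424, 0.0986])
print(probs.sum())  # tensor(1.) -- the probabilities add up to 1
```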

Dropout: The dropout technique works by randomly reducing the number of interconnecting neurons within a neural network. At every training step, each neuron has a chance of being left out, or rather, dropped out of the collated contributions from connected neurons.
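A minimal sketch; the dropout probability of 0.5 is an arbitrary illustrative choice:

```python
import torch
import torch.nn as nn

drop = nn.Dropout(p=0.5)  # each activation has a 50% chance of being zeroed
drop.train()              # dropout is only active in training mode

x = torch.ones(10)
print(drop(x))  # roughly half the entries are 0; survivors scale by 1/(1-p)
```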

4. Hyperparameters

Photo by Marko Blažević on Unsplash

Loss function: A method that quantifies ‘how well’ a machine learning model performs. The quantification is an output (cost) based on a set of inputs, which are referred to as parameter values. The parameter values are used to estimate a prediction, and the ‘loss’ is the difference between the predictions and the actual values.
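As a small illustration, here is a sketch using cross-entropy, one common choice of loss function; the logits and target are made-up values:

```python
import torch
import torch.nn as nn

loss_fn = nn.CrossEntropyLoss()  # a common loss for classification

logits = torch.tensor([[2.0, 0.5, 0.1]])  # the model's raw predictions
target = torch.tensor([0])                # the true class index
print(loss_fn(logits, target))  # ~0.317 -- lower means a closer prediction
```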

Optimization Algorithm: An optimizer within a neural network is an algorithmic implementation that facilitates the process of gradient descent within a neural network by minimizing the loss values provided via the loss function. To reduce the loss, it is paramount that the values of the weights within the network are selected appropriately; a minimal optimization step, including the learning rate, is sketched after the next definition.

Learning Rate: An integral component of a neural network implementation, as it is a factor that determines the size of the updates made to the values of the network’s weights. The learning rate is a type of hyperparameter.

Epoch: A numeric value that indicates the number of times a network has been exposed to all the data points within a training dataset.
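A minimal sketch of a training loop in which num_epochs controls how many full passes the network makes over the training data; the model, loss, and data are illustrative assumptions:

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

model = nn.Linear(10, 1)
loss_fn = nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loader = DataLoader(TensorDataset(torch.randn(100, 10), torch.randn(100, 1)),
                    batch_size=20)

num_epochs = 5  # the network sees every training sample 5 times
for epoch in range(num_epochs):
    for batch_x, batch_y in loader:  # one full pass over the loader = one epoch
        optimizer.zero_grad()
        loss = loss_fn(model(batch_x), batch_y)
        loss.backward()
        optimizer.step()
```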

    結(jié)論 (Conclusion)

There are obviously tons more terms and terminologies that you are bound to come across as you undertake and complete machine learning projects.

In future articles, I’ll probably expand on more complex concepts within machine learning that appear frequently.

Feel free to save the article or share it with machine learning practitioners who are at the start of their learning journey or career.

I hope you found the article useful.

To connect with me or find more content similar to this article, do the following:

  • Subscribe to my email list for weekly newsletters

  • Follow me on Medium

  • Connect and reach me on LinkedIn

Translated from: https://towardsdatascience.com/you-should-be-aware-of-these-common-deep-learning-terms-and-terminologies-26e0522fb88b
