A PyTorch BiLSTM+CRF for sequence labeling
Understanding the CRF took some effort, so I annotated the harder-to-follow parts and revisit them every few days to reinforce my memory. The code is the example from the PyTorch documentation.
import torch
import torch.autograd as autograd
import torch.nn as nn
import torch.optim as optim

def to_scalar(var):
    # var is a Variable holding a single value; return it as a Python number
    return var.view(-1).data.tolist()[0]

def argmax(vec):
    # return the argmax along dim 1 as a Python int
    _, idx = torch.max(vec, 1)
    return to_scalar(idx)

def prepare_sequence(seq, to_ix):
    # map a list of words to a Variable of word indices
    idxs = [to_ix[w] for w in seq]
    tensor = torch.LongTensor(idxs)
    return autograd.Variable(tensor)

# Compute log sum exp in a numerically stable way for the forward algorithm
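To see what these helpers do, here is a quick sketch. The `word_to_ix` vocabulary is made up for illustration and is not from the article:

```python
import torch
import torch.autograd as autograd

def to_scalar(var):
    # var holds a single value; return it as a Python number
    return var.view(-1).data.tolist()[0]

def argmax(vec):
    # return the argmax along dim 1 as a Python int
    _, idx = torch.max(vec, 1)
    return to_scalar(idx)

def prepare_sequence(seq, to_ix):
    # map a list of words to a Variable of word indices
    idxs = [to_ix[w] for w in seq]
    tensor = torch.LongTensor(idxs)
    return autograd.Variable(tensor)

# word_to_ix is a toy vocabulary, for illustration only
word_to_ix = {"the": 0, "dog": 1, "barked": 2}
print(prepare_sequence(["dog", "barked", "the"], word_to_ix).tolist())  # [1, 2, 0]
print(argmax(torch.Tensor([[0.1, 0.7, 0.2]])))  # 1
```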
def log_sum_exp(vec):
    # vec is 1 x tagset_size (a Variable)
    max_score = vec[0, argmax(vec)]
    # max_score has dimension 1; max_score.view(1, -1) is 1 x 1, and
    # .expand(1, vec.size()[1]) broadcasts it to 1 x tagset_size
    max_score_broadcast = max_score.view(1, -1).expand(1, vec.size()[1])
    # why exponentiate first, then sum, and only then take the log?
    return max_score + torch.log(torch.sum(torch.exp(vec - max_score_broadcast)))

class BiLSTM_CRF(nn.Module):
    # ... (the rest of the class is truncated in this copy of the article)
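The comment in `log_sum_exp` asks why we exponentiate, then sum, and only then take the log. The forward algorithm needs log(Σ exp(scoreᵢ)), but scores can be large and `exp` of a large float overflows. Subtracting the max first keeps every exponent ≤ 0, and adding the max back afterwards gives the same mathematical result. A minimal pure-Python sketch of the trick (no PyTorch needed):

```python
import math

def naive_lse(scores):
    # overflows for large scores: math.exp(1000) raises OverflowError
    return math.log(sum(math.exp(s) for s in scores))

def stable_lse(scores):
    # shift by the max so every exponent is <= 0, then add the max back
    m = max(scores)
    return m + math.log(sum(math.exp(s - m) for s in scores))

scores = [1000.0, 999.0, 998.0]
# naive_lse(scores) would raise OverflowError
print(round(stable_lse(scores), 4))  # 1000.4076
```

The identity behind it: log Σ exp(sᵢ) = m + log Σ exp(sᵢ − m) for any constant m; choosing m = max(sᵢ) makes the computation safe.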