Using an RNN to Automatically Generate Tang and Song Poetry

RNNs (Recurrent Neural Networks) have strong advantages when processing long sequences, and together with recent successes in training them effectively, they have been applied very well to long text.

Simply put, an RNN can remember certain features of a long sequence, which makes it well suited to sequential data. RNNs can handle many kinds of sequential information, and the most widespread application is text, including sentiment analysis and automatic text generation. A fair amount of work abroad has targeted automatic generation of English poetry, while relatively little has been done on generating Chinese text. Our classical poetry, especially Tang and Song poetry, is a vast corpus, and it follows inherent patterns of its own; if a neural network can discover and represent those patterns, a machine can write poems.

First you need training samples. I collected more than 40,000 Tang poems from the internet; each line of poetry.txt holds one poem in the form title:content (title and body separated by a colon), which the parsing code below relies on.


Next we need to map each Chinese character to a number, as a simple stand-in for a proper embedding. Embedding research has made great progress, but here we keep things simple: count the frequency of every character in the corpus, sort characters from most to least frequent, and use each character's rank in that list as its ID. This gives us a mapping from every character to an integer index.

 

import collections

import numpy as np
import tensorflow as tf

poetry_file = 'poetry.txt'

# Collect the poems
poetrys = []
with open(poetry_file, "r", encoding='utf-8') as f:
	for line in f:
		try:
			title, content = line.strip().split(':')
			content = content.replace(' ', '')
			# Skip poems containing annotations or unusual punctuation
			if '_' in content or '(' in content or '(' in content or '《' in content or '[' in content:
				continue
			# Skip poems that are too short or too long
			if len(content) < 5 or len(content) > 79:
				continue
			# '[' and ']' mark the start and end of each poem
			content = '[' + content + ']'
			poetrys.append(content)
		except ValueError:
			pass

# Sort poems by length
poetrys = sorted(poetrys, key=lambda line: len(line))
print('Total number of poems: ', len(poetrys))

# Count how often each character occurs
all_words = []
for poetry in poetrys:
	all_words += [word for word in poetry]
counter = collections.Counter(all_words)
count_pairs = sorted(counter.items(), key=lambda x: -x[1])
words, _ = zip(*count_pairs)

# Keep all characters, sorted by frequency; append ' ' for padding
words = words + (' ',)
# Map each character to an integer ID (its frequency rank)
word_num_map = dict(zip(words, range(len(words))))
# Convert each poem into a list of IDs; unknown characters map to the last ID
to_num = lambda word: word_num_map.get(word, len(words))
poetrys_vector = [list(map(to_num, poetry)) for poetry in poetrys]

     

After this mapping, every poem has been converted into a vector of integer IDs whose length equals the number of characters in the poem.
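The training and generation code further below refers to batch_size, n_chunk, x_batches, y_batches, the placeholders input_data and output_targets, and a session config, none of which are defined in this excerpt. Here is a minimal sketch of how they could be built, assuming each batch is padded to equal length with the space character and the target sequence is the input shifted left by one character:

batch_size = 64  # use batch_size = 1 when generating poems
n_chunk = len(poetrys_vector) // batch_size

x_batches = []
y_batches = []
for i in range(n_chunk):
	batches = poetrys_vector[i * batch_size:(i + 1) * batch_size]
	length = max(map(len, batches))
	# Pad every poem in the batch to the same length with the space ID
	xdata = np.full((batch_size, length), word_num_map[' '], np.int32)
	for row in range(batch_size):
		xdata[row, :len(batches[row])] = batches[row]
	# The target at each position is the next character of the input
	ydata = np.copy(xdata)
	ydata[:, :-1] = xdata[:, 1:]
	x_batches.append(xdata)
	y_batches.append(ydata)

# Placeholders fed with one batch at a time
input_data = tf.placeholder(tf.int32, [batch_size, None])
output_targets = tf.placeholder(tf.int32, [batch_size, None])

# Session config (assumed here); lets TensorFlow fall back to CPU if needed
config = tf.ConfigProto(allow_soft_placement=True)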

We then train an RNN on each poem. Building an RNN is fairly standardized by now; see Google's official TensorFlow documentation for details.

 

def neural_network(model='lstm', rnn_size=128, num_layers=2):
	# Choose the recurrent cell type
	if model == 'rnn':
		cell_fun = tf.nn.rnn_cell.BasicRNNCell
	elif model == 'gru':
		cell_fun = tf.nn.rnn_cell.GRUCell
	elif model == 'lstm':
		cell_fun = tf.nn.rnn_cell.BasicLSTMCell

	# Build one fresh cell per layer (reusing a single cell object across
	# layers raises an error in newer TF 1.x versions)
	cells = [cell_fun(rnn_size, state_is_tuple=True) for _ in range(num_layers)]
	cell = tf.nn.rnn_cell.MultiRNNCell(cells, state_is_tuple=True)
	initial_state = cell.zero_state(batch_size, tf.float32)

	with tf.variable_scope('rnnlm'):
		# Output projection onto the vocabulary (+1 for the unknown ID)
		softmax_w = tf.get_variable("softmax_w", [rnn_size, len(words)+1])
		softmax_b = tf.get_variable("softmax_b", [len(words)+1])
		with tf.device('/gpu:0'):
			embedding = tf.get_variable("embedding", [len(words)+1, rnn_size])
			inputs = tf.nn.embedding_lookup(embedding, input_data)

	outputs, last_state = tf.nn.dynamic_rnn(cell, inputs, initial_state=initial_state, scope='rnnlm')
	output = tf.reshape(outputs, [-1, rnn_size])

	logits = tf.matmul(output, softmax_w) + softmax_b
	probs = tf.nn.softmax(logits)
	return logits, last_state, probs, cell, initial_state

 

Once the network is built we can train it. We train in mini-batches, 64 poems at a time.

 

def train_neural_network():
	logits, last_state, _, _, _ = neural_network()
	targets = tf.reshape(output_targets, [-1])
	# Per-character cross-entropy, averaged over the whole batch
	loss = tf.contrib.legacy_seq2seq.sequence_loss_by_example([logits], [targets], [tf.ones_like(targets, dtype=tf.float32)], len(words))
	cost = tf.reduce_mean(loss)
	learning_rate = tf.Variable(0.0, trainable=False)
	tvars = tf.trainable_variables()
	# Clip gradients to stabilize RNN training
	grads, _ = tf.clip_by_global_norm(tf.gradients(cost, tvars), 5)
	optimizer = tf.train.AdamOptimizer(learning_rate)
	train_op = optimizer.apply_gradients(zip(grads, tvars))

	with tf.Session(config=config) as sess:
		sess.run(tf.global_variables_initializer())
		saver = tf.train.Saver(tf.global_variables())

		for epoch in range(50):
			# Decay the learning rate each epoch
			sess.run(tf.assign(learning_rate, 0.002 * (0.97 ** epoch)))
			for batch in range(n_chunk):
				train_loss, _, _ = sess.run([cost, last_state, train_op], feed_dict={input_data: x_batches[batch], output_targets: y_batches[batch]})
				print(epoch, batch, train_loss)
			# Checkpoint every 7 epochs (./train_dir must already exist)
			if epoch % 7 == 0:
				saver.save(sess, './train_dir/poetry.ckpt', global_step=epoch)

 

Checkpoints are written during training (every 7 epochs above), so the model is saved on disk by the time training ends.

Next time we can load this model directly and sample from it randomly, so that each run generates a different poem. Of course this raises the question of when to stop: since I wrapped every training poem in '[' and ']', the network learns ']' as an end-of-poem marker, and we stop generating as soon as it is emitted.

 

def gen_poetry():
	def to_word(weights):
		# Sample a character index in proportion to the predicted probabilities
		t = np.cumsum(weights)
		s = np.sum(weights)
		sample = int(np.searchsorted(t, np.random.rand(1) * s))
		return words[sample]

	_, last_state, probs, cell, initial_state = neural_network()
	result = ""

	with tf.Session() as sess:
		sess.run(tf.global_variables_initializer())

		saver = tf.train.Saver(tf.global_variables())

		# Restore the most recent checkpoint
		module_file = tf.train.latest_checkpoint('./train_dir')
		print(module_file)
		saver.restore(sess, module_file)

		state_ = sess.run(cell.zero_state(1, tf.float32))

		# Feed the start-of-poem marker '[' first
		x = np.array([list(map(word_num_map.get, '['))])
		[probs_, state_] = sess.run([probs, last_state], feed_dict={input_data: x, initial_state: state_})
		word = to_word(probs_)
		#word = words[np.argmax(probs_)]  # greedy alternative to sampling
		poem = ''
		# Generate one character at a time until the end marker ']'
		while word != ']':
			poem += word
			x = np.zeros((1, 1))
			x[0, 0] = word_num_map[word]
			[probs_, state_] = sess.run([probs, last_state], feed_dict={input_data: x, initial_state: state_})
			word = to_word(probs_)
		result = poem
	return result
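In practice, training and generation would run as separate invocations, since the graph must be rebuilt with batch_size = 1 for generation (we feed a single character at a time). Assuming the preprocessing and network definitions above are in the same script and a checkpoint exists under ./train_dir, a minimal run might look like:

if __name__ == '__main__':
	batch_size = 1  # generation feeds one character at a time
	print(gen_poetry())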

Sample results: each run generates a different Tang poem. A few generated poems:

poetry1:東遠春生夢,浮波奔浩氛。光繁空井碧,池輩正無塵。茗牖藏田畔,雲霞有瑞香。煙波阻此去,風景向秦關。枕外無多跡,臨朝半鏡明。誰憐竹洞裏,終可遣忘衡。

poetry2:行深復何路,異客動郊山。又失天涯外,孤舟行處稀。共知緣衛渡,又上故鄉情。月有妝齋滿,野心迎夕天。塞風岡自入,谷口和蹤息。修菊倍傍人,結人難相慰,還是若雲棲。

poetry3:莫訝翼憧鞬事,至楊初駐袖中筵。輕竿留戴黃蓑楫,慘淡時將六隊聲。晴落彩雲依郭處,惡雲移以賦行人。那堪數曲回車職,更見纖塵亦恐眠。

GitHub: https://github.com/danzhewuju
