ELMo is a pre-trained model, published by Google on TensorFlow Hub, for creating contextualized word embeddings. It can be loaded directly from TensorFlow Hub.
ELMo doesn't work with TF 2.x; to run the code in this post, make sure you are using TF 1.15.0:
pip install tensorflow==1.15.0
pip install tensorflow_hub
import tensorflow_hub as hub
import tensorflow as tf
print(tf.version.VERSION)
# Load the ELMo model from TensorFlow Hub
elmo = hub.Module("https://tfhub.dev/google/elmo/2", trainable=True)
# Provide input tensor and create embeddings
input_tensor = ["This tutorial is on elmo embeddings from tensorflow hub",
                "TensorFlow hub provides many reusable pre trained models in several domains"]
embeddings_tensor = elmo(input_tensor,
                         signature="default",
                         as_dict=True)["elmo"]
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    embeddings = sess.run(embeddings_tensor)
    print(embeddings.shape)
    print(embeddings)
===== Output =====
(2, 11, 1024)
[[[-0.5142068 -0.4636345 0.06954885 ... -0.2707631 -0.02428255
-0.00331099]
[-0.95270574 -0.04734205 0.42021 ... 0.0421902 0.5397179
0.26651388]
[-0.2660544 0.02247591 -0.18202654 ... -0.4068279 0.49980593
0.74724764]
...
[ 0.18841442 0.10420661 -0.07694399 ... -0.23975265 0.04203938
0.12900949]
[-0.02840841 -0.04353216 0.04130162 ... 0.02583168 -0.01429836
-0.01650422]
[-0.02840841 -0.04353216 0.04130162 ... 0.02583168 -0.01429836
-0.01650422]]
[[-0.07535858 -0.4540003 0.14424387 ... -0.10658199 -0.04327318
0.187071 ]
[ 0.2766684 0.24213223 -0.13952938 ... -0.5453592 0.32170922
0.42950815]
[-0.08102435 0.7172947 -0.3830716 ... 0.33446005 0.2105927
0.59616154]
...
[ 0.05506064 0.699861 -0.0179209 ... -0.7185075 -0.3778218
-0.04791632]
[ 0.25612986 0.6162555 0.17239812 ... 0.28734833 -0.35074
-0.1823772 ]
[ 0.36258197 0.20596306 0.6186662 ... -0.28990933 0.24846411
-0.20570272]]]
The output tensor has shape [batch_size, max_length, 1024]. In this example the input is a list of 2 sentences and the longer one has 11 tokens, hence the shape of the output tensor is (2, 11, 1024). Shorter sentences are padded up to max_length, which is why the trailing rows in the first sentence's block repeat.
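The per-token vectors can be pooled into a single fixed-size vector per sentence, e.g. for similarity comparisons. Below is a minimal NumPy sketch: the random array merely stands in for the (2, 11, 1024) ELMo output above, and `sentence_vectors` and `cosine` are illustrative helpers, not part of the hub module. The true token counts are passed in so padding rows are excluded from the mean.

```python
import numpy as np

def sentence_vectors(embeddings, lengths):
    """Mean-pool token embeddings into one vector per sentence.

    embeddings: array of shape (batch, max_length, dim)
    lengths: true token count of each sentence, so padding
             rows are excluded from the mean.
    """
    return np.stack([emb[:n].mean(axis=0)
                     for emb, n in zip(embeddings, lengths)])

def cosine(a, b):
    # Cosine similarity between two 1-D vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy stand-in for the (2, 11, 1024) ELMo output shown above.
rng = np.random.RandomState(0)
embeddings = rng.randn(2, 11, 1024).astype(np.float32)

vecs = sentence_vectors(embeddings, lengths=[9, 11])
print(vecs.shape)  # (2, 1024)
print(cosine(vecs[0], vecs[1]))
```

With the real ELMo array you would feed `embeddings` from `sess.run(embeddings_tensor)` instead of the random stand-in.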
Category: TensorFlow