TensorFlow API Summary

  1. Overview
  2. tf.nn.embedding_lookup()
  3. tf.sequence_mask
  4. tf.expand_dims
  5. tf.gather()
  6. tf.reshape()
  7. tf.split()
  8. tf.transpose()

Overview

  • A summary of common tf operations

tf.nn.embedding_lookup()

tf.nn.embedding_lookup(
  params,
  ids,
  partition_strategy='mod',
  name=None,
  validate_indices=True,
  max_norm=None
)

Looks up ids in a list of embedding tensors. This function is used to perform parallel lookups on the list of tensors in params. It is a generalization of tf.gather, where params is interpreted as a partitioning of a large embedding tensor. params may be a PartitionedVariable as returned by using tf.get_variable() with a partitioner. In short, it returns the rows of params indexed by ids, in the order given by ids.

  • If len(params) > 1, each element id of ids is partitioned between the elements of params according to the partition_strategy. In all strategies, if the id space does not evenly divide the number of partitions, each of the first (max_id + 1) % len(params) partitions will be assigned one more id.
  • If partition_strategy is “mod”, we assign each id to partition p = id % len(params). For instance, 13 ids are split across 5 partitions as: [[0, 5, 10], [1, 6, 11], [2, 7, 12], [3, 8], [4, 9]]
  • If partition_strategy is “div”, we assign ids to partitions in a contiguous manner. In this case, 13 ids are split across 5 partitions as: [[0, 1, 2], [3, 4, 5], [6, 7, 8], [9, 10], [11, 12]]

The results of the lookup are concatenated into a dense tensor. The returned tensor has shape shape(ids) + shape(params)[1:].
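
For the partitioned case (len(params) > 1), which the demo below does not exercise, here is a minimal sketch of the default 'mod' strategy; the partition tensors p0 and p1 are illustrative names, not part of the API:

p0 = tf.constant([[0., 0.], [2., 2.], [4., 4.]])  # holds ids 0, 2, 4 (id % 2 == 0)
p1 = tf.constant([[1., 1.], [3., 3.]])            # holds ids 1, 3 (id % 2 == 1)
out = tf.nn.embedding_lookup([p0, p1], [0, 1, 2, 3, 4])
# out == [[0. 0.], [1. 1.], [2. 2.], [3. 3.], [4. 4.]]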

demo

import numpy as np
import tensorflow as tf

a = np.array(
  [[0.1, 0.2, 0.3],
  [1.1, 1.2, 1.3],
  [2.1, 2.2, 2.3],
  [3.1, 3.2, 3.3],
  [4.1, 4.2, 4.3]])
idx1 = tf.Variable([0, 2, 3, 1], dtype=tf.int32)
idx2 = tf.Variable([[0, 2, 3, 1], [4, 0, 2, 2]], dtype=tf.int32)
out1 = tf.nn.embedding_lookup(a, idx1)
out2 = tf.nn.embedding_lookup(a, idx2)
init = tf.global_variables_initializer()

with tf.Session() as sess:
  sess.run(init)
  print(a)
  print('=========')
  print(sess.run(out1))
  print('=========')
  print(sess.run(out2))

result

[[0.1 0.2 0.3]
 [1.1 1.2 1.3]
 [2.1 2.2 2.3]
 [3.1 3.2 3.3]
 [4.1 4.2 4.3]]
=========
[[0.1 0.2 0.3]
 [2.1 2.2 2.3]
 [3.1 3.2 3.3]
 [1.1 1.2 1.3]]
=========
[[[0.1 0.2 0.3]
  [2.1 2.2 2.3]
  [3.1 3.2 3.3]
  [1.1 1.2 1.3]]
 [[4.1 4.2 4.3]
  [0.1 0.2 0.3]
  [2.1 2.2 2.3]
  [2.1 2.2 2.3]]]

tf.sequence_mask

tf.sequence_mask(
  lengths,
  maxlen=None,
  dtype=tf.bool,
  name=None
)

Returns a mask tensor representing the first N positions of each cell. If lengths has shape [d_1, d_2, …, d_n] the resulting tensor mask has dtype dtype and shape [d_1, d_2, …, d_n, maxlen], with
mask[i_1, i_2, …, i_n, j] = (j < lengths[i_1, i_2, …, i_n])

Examples:

tf.sequence_mask([1, 3, 2], 5)
# [[True, False, False, False, False],
#  [True, True, True, False, False],
#  [True, True, False, False, False]]

Another example, where lengths is 2-D:

tf.sequence_mask([[1, 3], [2, 0]])
# [[[True, False, False],
#   [True, True, True]],
#  [[True, True, False],
#   [False, False, False]]]
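
A common use is casting the mask to float and multiplying it in to zero out padded positions; a minimal sketch, where seqs is a hypothetical batch of 3 sequences padded to length 5:

seqs = tf.constant([[5., 6., 0., 0., 0.],
                    [1., 2., 3., 4., 0.],
                    [9., 8., 7., 0., 0.]])
mask = tf.sequence_mask([2, 4, 3], maxlen=5, dtype=tf.float32)
masked = seqs * mask  # positions at or beyond each sequence length become 0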

tf.expand_dims

tf.expand_dims( 
  input, 
  axis=None, 
  name=None, 
  dim=None
)

See the guide: Tensor Transformations > Shapes and Shaping

Inserts a dimension of 1 into a tensor’s shape.

Given a tensor input, this operation inserts a dimension of 1 at the dimension index axis of input’s shape. The dimension index axis starts at zero; if you specify a negative number for axis it is counted backward from the end.

This operation is useful if you want to add a batch dimension to a single element. For example, if you have a single image of shape [height, width, channels], you can make it a batch of 1 image with expand_dims(image, 0), which will make the shape [1, height, width, channels].

Other examples:

# 't' is a tensor of shape [2]
tf.shape(tf.expand_dims(t, 0))  # [1, 2]
tf.shape(tf.expand_dims(t, 1))  # [2, 1]
tf.shape(tf.expand_dims(t, -1))  # [2, 1]

# 't2' is a tensor of shape [2, 3, 5]
tf.shape(tf.expand_dims(t2, 0))  # [1, 2, 3, 5]
tf.shape(tf.expand_dims(t2, 2))  # [2, 3, 1, 5]
tf.shape(tf.expand_dims(t2, 3))  # [2, 3, 5, 1]

This operation requires that:

-1-input.dims() <= dim <= input.dims()

This operation is related to squeeze(), which removes dimensions of size 1.
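
A minimal sketch of the round trip with squeeze():

t = tf.constant([1, 2])        # shape [2]
t1 = tf.expand_dims(t, 0)      # shape [1, 2]
t2 = tf.squeeze(t1, axis=0)    # shape [2] again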

tf.gather()

tf.gather( 
  params, 
  indices, 
  validate_indices=None,
  name=None, 
  axis=0
)

indices must be an integer tensor of any dimension (usually 0-D or 1-D). It produces an output tensor with shape params.shape[:axis] + indices.shape + params.shape[axis + 1:].

import tensorflow as tf
x = tf.range(0, 10)*10 + tf.constant(1, shape=[10])
y = tf.gather(x, [1, 5, 9])
with tf.Session() as sess:
  print(sess.run(x))
  print(sess.run(y))

result

[ 1 11 21 31 41 51 61 71 81 91]
[11 51 91]
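
The demo above gathers along the default axis=0. A small sketch of axis=1, which selects columns and illustrates the shape formula:

m = tf.constant([[1, 2, 3],
                 [4, 5, 6]])
cols = tf.gather(m, [0, 2], axis=1)
# cols == [[1 3]
#          [4 6]]
# shape: m.shape[:1] + indices.shape + m.shape[2:] = [2, 2]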

tf.reshape()

tf.reshape(
  tensor,
  shape,
  name=None
)

Given tensor, this operation returns a tensor that has the same values as tensor with shape shape.

If one component of shape is the special value -1, the size of that dimension is computed so that the total size remains constant. In particular, a shape of [-1] flattens into 1-D. At most one component of shape can be -1. If shape is 1-D or higher, then the operation returns a tensor with shape shape filled with the values of tensor. In this case, the number of elements implied by shape must be the same as the number of elements in tensor.

demo

a = tf.constant(np.array(list(range(1, 9+1, 1))))
b = tf.reshape(a, [3, 3])
# t: a (3 x 2 x 3) tensor
t = tf.constant(
  [[[1, 1, 1],
  [2, 2, 2]],
  [[3, 3, 3],
  [4, 4, 4]],
  [[5, 5, 5],
  [6, 6, 6]]]
)
c = tf.reshape(t, [2, -1])
d = tf.reshape(t, [-1, 9])
e = tf.reshape(t, [2, -1, 3])

with tf.Session() as sess:
  print(sess.run(a), end='\n\n')
  print(sess.run(b), end='\n\n')
  print(b.get_shape().as_list())
  print(sess.run(t), end='\n\n')
  print(t.get_shape().as_list())
  print(sess.run(c), end='\n\n')
  print(c.get_shape().as_list())
  print(sess.run(d), end='\n\n')
  print(d.get_shape().as_list())
  print(sess.run(e), end='\n\n')
  print(e.get_shape().as_list())

result

[1 2 3 4 5 6 7 8 9]

[[1 2 3]
 [4 5 6]
 [7 8 9]]

[3, 3]
[[[1 1 1]
  [2 2 2]]

 [[3 3 3]
  [4 4 4]]

 [[5 5 5]
  [6 6 6]]]

[3, 2, 3]
[[1 1 1 2 2 2 3 3 3]
 [4 4 4 5 5 5 6 6 6]]

[2, 9]
[[1 1 1 2 2 2 3 3 3]
 [4 4 4 5 5 5 6 6 6]]

[2, 9]
[[[1 1 1]
  [2 2 2]
  [3 3 3]]

 [[4 4 4]
  [5 5 5]
  [6 6 6]]]

[2, 3, 3]
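
Note that reshape reads and writes elements in row-major order and never reorders them, so it is not a substitute for transpose; a minimal sketch:

v = tf.reshape(tf.constant([[1, 2, 3], [4, 5, 6]]), [3, 2])
# v == [[1 2]
#       [3 4]
#       [5 6]]
# tf.transpose would instead give [[1 4], [2 5], [3 6]]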

tf.split()

tf.split(
  value,
  num_or_size_splits,
  axis=0,
  num=None,
  name='split'
)

Splits a tensor into sub tensors.

If num_or_size_splits is an integer type, num_split, then splits value along dimension axis into num_split smaller tensors. Requires that num_split evenly divides value.shape[axis]. If num_or_size_splits is not an integer type, it is presumed to be a Tensor size_splits, then splits value into len(size_splits) pieces. The shape of the i-th piece has the same size as the value except along dimension axis where the size is size_splits[i].

demo

a = tf.constant(np.reshape(list(range(1, 6+1, 1)), (2, 3)))
b1, b2, b3 = tf.split(a, 3, axis=1)
d1, d2, d3 = tf.split(a, [1, 1, 1], axis=1)
c1, c2 = tf.split(a, [1, 2], axis=1)

with tf.Session() as sess:
  print(sess.run(a), end='\n\n')
  print(sess.run(b1), end='\n\n')
  print(sess.run(b2), end='\n\n')
  print(sess.run(b3), end='\n\n')
  print(sess.run(d1), end='\n\n')
  print(sess.run(d2), end='\n\n')
  print(sess.run(d3), end='\n\n')
  print(sess.run(c1), end='\n\n')
  print(sess.run(c2), end='\n\n')

result

[[1 2 3]
 [4 5 6]]

[[1]
 [4]]

[[2]
 [5]]

[[3]
 [6]]

[[1]
 [4]]

[[2]
 [5]]

[[3]
 [6]]

[[1]
 [4]]

[[2 3]
 [5 6]]
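
As a sanity check, tf.concat along the same axis undoes the split; a sketch reusing a from the demo above:

parts = tf.split(a, [1, 2], axis=1)
restored = tf.concat(parts, axis=1)  # same values and shape as a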

tf.transpose()
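
The TF 1.x signature:

tf.transpose(
  a,
  perm=None,
  name='transpose',
  conjugate=False
)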

demo

a = tf.constant(
  [[1, 2, 3],
  [4, 5, 6]]
)
b = tf.transpose(a)
# equivalently
c = tf.transpose(a, [1, 0])
# x : 2 x 2 x 3
x = tf.constant(
  [[[1, 2, 3],
  [4, 5, 6]],
  [[7, 8, 9],
  [10, 11, 12]]]
)
# perm is more useful for n-dimensional tensors, n > 2
# y : 2 x 3 x 2
y = tf.transpose(x, perm=[0, 2, 1])

with tf.Session() as sess:
  print(sess.run(a), end='\n\n')
  print(sess.run(b), end='\n\n')
  print(sess.run(c), end='\n\n')

  print(sess.run(x), end='\n\n')
  print(x.get_shape().as_list(), end='\n\n')
  print(sess.run(y), end='\n\n')
  print(y.get_shape().as_list(), end='\n\n')

result

[[1 2 3]
 [4 5 6]]

[[1 4]
 [2 5]
 [3 6]]

[[1 4]
 [2 5]
 [3 6]]

[[[ 1  2  3]
  [ 4  5  6]]

 [[ 7  8  9]
  [10 11 12]]]

[2, 2, 3]

[[[ 1  4]
  [ 2  5]
  [ 3  6]]

 [[ 7 10]
  [ 8 11]
  [ 9 12]]]

[2, 3, 2]

Please credit the source when reposting: from goldandrabbit.github.io
