  • #11. Tensorflow Tutorial
    연구실 2019. 10. 10. 16:10

    * Exploring the Tensorflow Library

    - When you run a program in TensorFlow, it goes through the following steps:

        (1) Create Tensors (variables) that are not yet executed/evaluated

        (2) Write operations between those Tensors

        (3) Initialize the Tensors

        (4) Create a Session

        (5) Run the Session. This runs the operations written in (2)

    y_hat = tf.constant(36, name='y_hat')            # Define y_hat constant. Set to 36.
    y = tf.constant(39, name='y')                    # Define y. Set to 39
    
    loss = tf.Variable((y - y_hat)**2, name='loss')  # Create a variable for the loss
    
    init = tf.global_variables_initializer()         # When init is run later (session.run(init)),
                                                     # the loss variable will be initialized and ready to be computed
    with tf.Session() as session:                    # Create a session and print the output
        session.run(init)                            # Initializes the variables
        print(session.run(loss))                     # Prints the loss

     

    - For example, consider:

    a = tf.constant(2)
    b = tf.constant(10)
    c = tf.multiply(a,b)
    print(c)

    In a case like this, the Tensors have been declared and the operation defined, but nothing has actually been run yet, so the printed result is Tensor("Mul:0", shape=(), dtype=int32) rather than 20.

     

    sess = tf.Session()
    print(sess.run(c))

    - Only after creating a Session like this and running it does the correct result come out.
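    For comparison (my addition, not part of the original tutorial): in plain NumPy the same computation is evaluated immediately, with no graph or session, which is why the TF1 example above needs sess.run() to produce 20.

    ```python
    import numpy as np

    # In NumPy, operations are evaluated eagerly: c holds the value 20
    # right away, unlike the TF1 graph, which only holds an unevaluated node.
    a = np.int32(2)
    b = np.int32(10)
    c = a * b
    print(c)  # 20
    ```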

     

    - Placeholder: an object whose value you can specify only later

    - To supply a value to a placeholder, you must pass it in through a "feed dictionary".

    # Change the value of x in the feed_dict
    
    x = tf.placeholder(tf.int64, name = 'x')
    print(sess.run(2 * x, feed_dict = {x: 3}))
    sess.close()

    - You do not need to specify a value when you first create a placeholder; you feed data into it later when you run the session.

     

     

    (1) Linear function

    - Take Y = WX + b as an example.

    X = tf.constant(np.random.randn(3, 1), name = 'X')
    W = tf.constant(np.random.randn(4, 3), name = 'W')
    b = tf.constant(np.random.randn(4, 1), name = 'b')
    Y = tf.add(tf.matmul(W, X), b)
    
    sess = tf.Session()
    result = sess.run(Y)
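    The same linear function can be sketched in NumPy to check the shapes (an illustrative equivalent I've added, not from the post): W is (4, 3), X is (3, 1), so WX + b is (4, 1).

    ```python
    import numpy as np

    np.random.seed(1)          # fixed seed so the sketch is reproducible
    X = np.random.randn(3, 1)  # input, shape (3, 1)
    W = np.random.randn(4, 3)  # weights, shape (4, 3)
    b = np.random.randn(4, 1)  # bias, shape (4, 1)
    Y = W @ X + b              # same computation as tf.add(tf.matmul(W, X), b)
    print(Y.shape)  # (4, 1)
    ```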

     

    (2) Computing the sigmoid

    - TensorFlow provides commonly used functions such as tf.sigmoid and tf.softmax.

    def sigmoid(z):
        """
        Computes the sigmoid of z
        
        Arguments:
        z -- input value, scalar or vector
        
        Returns: 
        results -- the sigmoid of z
        """
        
        # Create a placeholder for x. Name it 'x'.
        x = tf.placeholder(tf.float32, name = 'x')
    
        # compute sigmoid(x)
        sigmoid = tf.sigmoid(x)
    
        # Create a session, and run it. Please use the method 2 explained above. 
        # You should use a feed_dict to pass z's value to x. 
        with tf.Session() as sess:
        # Run session and call the output "result"
            result = sess.run(sigmoid, feed_dict = {x: z})
    
        
        return result

    - Two ways to create and use a session:

        (1) sess = tf.Session()

            result = sess.run(..., feed_dict = {...})

            sess.close()

        (2) with tf.Session() as sess:

                 result = sess.run(.., feed_dict = {...})

     

    - feed_dict = {x: z} -> since x was declared as a placeholder, the value of z has to be passed into x later through feed_dict.
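    For reference, what tf.sigmoid computes is just sigmoid(z) = 1 / (1 + e^(-z)); a minimal NumPy sketch of the same function (my addition) for a sanity check:

    ```python
    import numpy as np

    def sigmoid_np(z):
        """Plain sigmoid: 1 / (1 + exp(-z)), same math as tf.sigmoid."""
        return 1.0 / (1.0 + np.exp(-z))

    print(sigmoid_np(0))   # 0.5
    print(sigmoid_np(12))  # close to 1 (~0.999994)
    ```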

     

    (3) Computing the Cost

    - There is also a built-in function for computing the cost.

    - Function for computing the cross-entropy loss: tf.nn.sigmoid_cross_entropy_with_logits(logits = ..., labels = ...)

    def cost(logits, labels):
        """
        Computes the cost using the sigmoid cross entropy
        
        Arguments:
        logits -- vector containing z, output of the last linear unit (before the final sigmoid activation)
        labels -- vector of labels y (1 or 0) 
        
        Note: What we've been calling "z" and "y" in this class are respectively called "logits" and "labels" 
        in the TensorFlow documentation. So logits will feed into z, and labels into y. 
        
        Returns:
        cost -- runs the session of the cost (formula (2))
        """
        
        # Create the placeholders for "logits" (z) and "labels" (y) (approx. 2 lines)
        z = tf.placeholder(tf.float32, name='z')
        y = tf.placeholder(tf.float32, name='y')
        
        # Use the loss function (approx. 1 line)
        cost = tf.nn.sigmoid_cross_entropy_with_logits(logits = z,  labels = y)
        
        # Create a session (approx. 1 line). See method 1 above.
        sess = tf.Session()
        
        # Run the session (approx. 1 line).
        cost = sess.run(cost, feed_dict={z: logits, y: labels})
        
        # Close the session (approx. 1 line). See method 1 above.
        sess.close()
    
        
        return cost
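    Under the hood, tf.nn.sigmoid_cross_entropy_with_logits computes, per element, -y*log(sigmoid(z)) - (1-y)*log(1-sigmoid(z)), in the numerically stable form max(z, 0) - z*y + log(1 + exp(-|z|)). A NumPy sketch of that formula (my addition, for checking the math rather than replacing the TF call):

    ```python
    import numpy as np

    def sigmoid_cross_entropy_np(logits, labels):
        """Elementwise -y*log(sigmoid(z)) - (1-y)*log(1-sigmoid(z)),
        written in the stable form max(z, 0) - z*y + log(1 + exp(-|z|))."""
        z = np.asarray(logits, dtype=float)
        y = np.asarray(labels, dtype=float)
        return np.maximum(z, 0) - z * y + np.log1p(np.exp(-np.abs(z)))

    # z = 0 means sigmoid(z) = 0.5, so the loss for label 1 is log(2)
    print(sigmoid_cross_entropy_np([0.0], [1.0]))  # [0.69314718]
    ```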

     

    (4) Using One Hot encodings

    - tf.one_hot(labels, depth, axis) computes this in a single call.

    def one_hot_matrix(labels, C):
        """
        Creates a matrix where the i-th row corresponds to the i-th class number and the j-th column
        corresponds to the j-th training example, so if example j has label i, entry (i, j) will be 1.
                         
        Arguments:
        labels -- vector containing the labels 
        C -- number of classes, the depth of the one hot dimension
        
        Returns: 
        one_hot -- one hot matrix
        """
    
        # Create a tf.constant equal to C (depth), name it 'C'. (approx. 1 line)
        C = tf.constant(C, name = 'C')
        
        # Use tf.one_hot, be careful with the axis (approx. 1 line)
        one_hot_matrix = tf.one_hot(labels, C, axis = 0)
        
        # Create the session (approx. 1 line)
        sess = tf.Session()
        
        # Run the session (approx. 1 line)
        one_hot = sess.run(one_hot_matrix)
        
        # Close the session (approx. 1 line). See method 1 above.
        sess.close()
        
        return one_hot
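    The effect of tf.one_hot with axis = 0 can be sketched in NumPy (an illustrative equivalent I've added): with examples along the columns, entry (i, j) is 1 exactly when example j has label i.

    ```python
    import numpy as np

    def one_hot_np(labels, C):
        """Returns a (C, m) matrix where entry (i, j) is 1 iff labels[j] == i,
        matching tf.one_hot(labels, C, axis=0)."""
        labels = np.asarray(labels)
        # Broadcast a (C, 1) column of class indices against a (1, m) row of labels
        return (np.arange(C)[:, None] == labels[None, :]).astype(np.float32)

    print(one_hot_np([1, 2, 3, 0, 2, 1], 4))  # shape (4, 6)
    ```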

     

     

    (5) Initialize with zeros and ones

    - tf.ones(shape) creates a tensor of the given shape filled with ones; tf.zeros(shape), with zeros.
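    These behave like their NumPy counterparts (the sketch below is NumPy, my addition); the only difference in TF1 is that you still need sess.run() to see the values.

    ```python
    import numpy as np

    # NumPy equivalents of tf.ones / tf.zeros
    ones = np.ones((2, 3))   # 2x3 matrix of 1.0
    zeros = np.zeros((3,))   # length-3 vector of 0.0
    print(ones)
    print(zeros)
    ```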

     


©hyunbul