tensorflow2x

tensorflow (version 2.x)

ν…μ„œν”Œλ‘œ 2.x의 이해

: ν…μ„œν”Œλ‘œ 2.xλŠ” μΌ€λΌμŠ€(keras)와 같은 ν•˜μ΄ 레벨 APIλ₯Ό μ‚¬μš©ν•  것을 ꢌμž₯ν•˜λ©°, λ‚΄λΆ€μ˜ 세뢀정보λ₯Ό 더 많이 μ œμ–΄ν•΄μ•Ό ν•˜λŠ” 경우 ν…μ„œν”„λ‘€ 1.x인 둜우레벨 APIλ₯Ό κ·ΈλŒ€λ‘œ μœ μ§€ν•œλ‹€

Eager execution

: ν…μ„œ ν”Œλ‘œ 2.x은 ν…μ„œν”Œλ‘œ 1.x의 정적 계산 κ·Έλž˜ν”„λ₯Ό μ •μ˜ν•œκ²ƒκ³Ό λ‹€λ₯΄κ²Œ, νŠΉλ³„ν•œ μ„Έμ…˜ μΈν„°νŽ˜μ΄μŠ€λ‚˜ ν”Œλ ˆμ΄μŠ€ 홀더 업이도 λ…Έλ“œλ₯Ό μ¦‰μ‹œ μ •μ˜, λ³€κ²½, μ‹€ν–‰ν•  수 있으며, 이것이 λ°”λ‘œ μ¦‰μ‹œ 싀행이닀 . 즉, λͺ¨λΈ μ •μ˜κ°€ 동적이고 싀행이 μ¦‰μ‹œ 이뀄진닀, κ·Έλž˜ν”„μ™€ μ„Έμ…˜μ€ κ΅¬ν˜„ μ„ΈλΆ€ μ‚¬ν•­μœΌλ‘œ κ³ λ €ν•΄μ•Ό ν•œλ‹€

AutoGraph

: ν…μ„œ ν”Œλ‘œ 2.x은 기본적으둜 if-while, print()κ³Ό 같이 κΈ°λ³Έ νŠΉμ§•κ³Ό 같은 μ œμ–΄νλ¦„μ„ 포함해 λͺ…λ Ήν˜• 파이썬 μ½”λ“œλ₯Ό μ§€μ›ν•˜μ—¬, 투λͺ…ν•˜κ³  역동적이며 μ¦‰μ‹œ μ‹€ν–‰ 파이썬 ν˜•μ‹ ν”„λ‘œκ·Έλž˜λ°κ³Ό 효율적인 κ·Έλž˜ν”„ 계산을 톡해 두세계λ₯Ό λͺ¨λ‘ ν™œμš©ν•˜λŠ” 연결고리λ₯Ό λ§Œλ“ λ‹€

  • How to use AutoGraph

    : Simply annotate your Python code with the tf.function decorator.

import tensorflow as tf

def linear_layer(x):
    return 3 * x + 2

# Annotating the main function converts it (and the functions it calls)
# into an optimized computation graph.
@tf.function
def simple_nn(x):
    return tf.nn.relu(linear_layer(x))

# A plain Python function, left untouched.
def simple_function(x):
    return 3 * x
  • Inspecting simple_nn shows that it is a special handle that interacts with TensorFlow internals, whereas simple_function is a plain Python function.

  • When using tf.function you only need to annotate one main function; every other function called from it is automatically and transparently converted into an optimized computation graph.

import tensorflow as tf
import timeit

cell = tf.keras.layers.LSTMCell(100)

@tf.function
def fn(input, state):
    return cell(input, state)

input = tf.zeros([100, 100])
state = [tf.zeros([100, 100])] * 2
# warmup
cell(input, state)
fn(input, state)

# time the eager call vs. the tf.function-compiled call
graph_time = timeit.timeit(lambda: cell(input, state), number=100)
auto_graph_time = timeit.timeit(lambda: fn(input, state), number=100)
print('graph_time:', graph_time)
print('auto_graph_time:', auto_graph_time)

Example output:

graph_time: 0.09712530000000008
auto_graph_time: 0.0456865999999998

μΌ€λΌμŠ€ API : 3κ°€μ§€ ν”„λ‘œκ·Έλž˜λ° λͺ¨λΈ

  • μΌ€λΌμŠ€λŠ” 순차적 API, ν•¨μˆ˜μ  API, λͺ¨λΈ μ„œλΈŒν΄λž˜μ‹±μ˜ μ„Έκ°€μ§€ ν”„λ‘œκ·Έλž˜λ° λͺ¨λΈκ³Ό ν•¨κ»˜ 더 ν•˜μ΄λ ˆλ²¨ APIλ₯Ό μ œκ³΅ν•œλ‹€

Sequential API

  • The Sequential API is very elegant, intuitive, and concise, and it fits roughly 90% of use cases. A minimal model definition is sketched below.
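A minimal sketch of a Sequential model (the layer sizes and input shape are illustrative, not from the source); the resulting model variable can then be passed to plot_model below:

import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation='relu', input_shape=(10,)),
    tf.keras.layers.Dense(1, activation='sigmoid'),
])
model.summary()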

tf.keras.utils.plot_model(model, to_file="model.png")

ν•¨μˆ˜μ (function) API

  • The Functional API is useful when you want to build models with more complex (non-linear) topologies, including multiple inputs, multiple outputs, residual connections with non-sequential flows, and shared, reusable layers.

  • Each layer is callable (with a tensor as input), and each layer returns a tensor as output.

  • λ‘κ°œμ˜ κ°œλ³„ μž…λ ₯, 두 개의 κ°œλ³„ λ‘œμ§€μŠ€ν‹± νšŒκ·€λ₯Ό 좜λ ₯으둜, ν•˜λ‚˜μ˜ 곡유 λͺ¨λ“ˆμ„ 쀑간에 κ°–λŠ” 예제

import tensorflow as tf

def build_model():
    # variable-length sequence of integers
    text_input_a = tf.keras.Input(shape=(None,), dtype='int32')

    # variable-length sequence of integers
    text_input_b = tf.keras.Input(shape=(None,), dtype='int32')

    # embedding that maps 1000 unique words to 128-dimensional vectors
    shared_embedding = tf.keras.layers.Embedding(1000, 128)

    # reuse the same layer to encode both inputs
    encoded_input_a = shared_embedding(text_input_a)
    encoded_input_b = shared_embedding(text_input_b)

    # finally, two logistic predictions
    prediction_a = tf.keras.layers.Dense(1, activation='sigmoid', name='prediction_a')(encoded_input_a)
    prediction_b = tf.keras.layers.Dense(1, activation='sigmoid', name='prediction_b')(encoded_input_b)

    # two inputs and two outputs,
    # with the shared model in the middle
    model = tf.keras.Model(inputs=[text_input_a, text_input_b],
                           outputs=[prediction_a, prediction_b])

    tf.keras.utils.plot_model(model, to_file="shared_model.png")

build_model()
  • Example of a non-linear topology

λͺ¨λΈ μ„œλΈŒν΄λž˜μ‹±

  • λͺ¨λΈ μ„œλΈŒν΄λž˜μ‹±μ€ 졜고의 μœ μ—°μ„±μ„ μ œκ³΅ν•˜λ©° 일반적으둜 μžμ‹ μ˜ 계측을 μ •μ˜ν•΄μ•Ό ν•  λ•Œ μ‚¬μš©. λΉ„μœ ν•˜μžλ©΄ ν‘œμ€€μ μ΄κ³  잘 μ•Œλ €μ§„ 레고 블둝을 κ΅¬μ„±ν•˜λŠ” λŒ€μ‹  μžμ‹ λ§Œμ˜ 레고 블둝을 λ§Œλ“€κ³ μž ν• λ•Œ 유용

  • __init__: optionally used to define all of the sublayers this layer will use; it is the constructor when declaring the model.

  • build: used to create the layer's weights; weights can be added with add_weight().

  • call: defines the forward pass; this is where the layer is invoked and chained in a functional style.

  • μ„ νƒμ μœΌλ‘œ get_config()λ₯Ό μ‚¬μš©ν•΄ κ²ŒμΈ΅μ„ 직렬화(serialize)ν•  수 있고, from_config()λ₯Ό μ‚¬μš©ν•˜λ©΄ μ—­μ§ˆλ ¬ν™”(deserialize)ν•  수 μžˆλ‹€

Callbacks

: μ½œλ°±μ€ ν›ˆλ ¨ 쀑에 λ…μž‘μ„ ν™•μž₯ν•˜κ±°λ‚˜ μˆ˜μ •ν•˜κ³ μž λͺ¨λΈλ‘œ μ „λ‹¬ν•˜λŠ” 객체

  • ModelCheckpoint: periodically saves checkpoints of the model so it can be recovered when something goes wrong

  • LearningRateScheduler: dynamically changes the learning rate during optimization

  • EarlyStopping: stops training when the validation performance has not improved for a while

  • TensorBoard: monitors the model's behavior with TensorBoard
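A sketch of wiring these callbacks into training (the model, training data, file paths, patience, and learning-rate schedule are assumed for illustration):

# model, x_train, y_train are assumed to be defined elsewhere
callbacks = [
    tf.keras.callbacks.ModelCheckpoint(filepath='ckpt/model-{epoch:02d}.h5'),
    tf.keras.callbacks.LearningRateScheduler(lambda epoch: 1e-3 * 0.95 ** epoch),
    tf.keras.callbacks.EarlyStopping(monitor='val_loss', patience=3),
    tf.keras.callbacks.TensorBoard(log_dir='./logs'),
]
model.fit(x_train, y_train, validation_split=0.2,
          epochs=20, callbacks=callbacks)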

λͺ¨λΈκ³Ό κ°€μ€‘μΉ˜ μ €μž₯

  • Save the weights to a TensorFlow checkpoint file

model.save_weights('./weights/model')  # save
model.load_weights(file_path)          # restore
  • Apart from the weights, the model itself can be serialized to JSON

json_string = model.to_json()                          # save
model = tf.keras.models.model_from_json(json_string)   # restore
  • Serialization to YAML

yaml_string = model.to_yaml()                          # save
model = tf.keras.models.model_from_yaml(yaml_string)   # restore
  • λͺ¨λΈμ„ κ°€μ€‘μΉ˜μ™€ μ΅œμ ν™” λ§€κ°œλ³€μˆ˜μ™€ ν•¨κ»˜ μ €μž₯

model.save('model.h5')                          # save
model = tf.keras.models.load_model('model.h5')  # restore

The Dataset library

  • Creation

    • from_tensor_slices(): takes individual (or multiple) NumPy arrays (or tensors) and supports batching

    • from_tensors(): similar to the above, but does not support batching

    • from_generator(): takes its input from a generator function

  • Transformation

    • batch(): sequentially splits the dataset into chunks of the given size

    • repeat(): duplicates the data

    • shuffle(): randomly shuffles the data

    • map(): applies a function to the data

    • filter(): applies a function to filter the data

  • Iteration

    • next_batch = iterator.get_next() (in TF 2.x eager mode a dataset can also be iterated over directly, as in the sketch below)
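A minimal sketch of a tf.data pipeline (the toy NumPy array and the transformation parameters are illustrative):

import numpy as np
import tensorflow as tf

# assumed toy data for illustration
x = np.arange(10, dtype=np.float32)

dataset = tf.data.Dataset.from_tensor_slices(x)
dataset = dataset.map(lambda v: v * 2).filter(lambda v: v > 4)
dataset = dataset.shuffle(buffer_size=10).batch(3).repeat(2)

# in TF 2.x eager mode the dataset is directly iterable
for batch in dataset:
    print(batch.numpy())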
