# 08. Edge AI - TensorFlow Lite
## Learning Objectives

- Understand the concept and advantages of Edge AI
- Grasp the fundamentals of TensorFlow Lite
- Learn how to convert models to the .tflite format
- Run inference on a Raspberry Pi
- Implement an image classification example
## 1. Edge AI Concepts

### 1.1 What is Edge AI?

Edge AI means running AI inference directly on an edge device (Raspberry Pi, smartphone, etc.) rather than in the cloud.
```
Cloud AI:  Sensor ──data──▶ Network ──▶ Cloud AI ──▶ Action
Edge AI:   Sensor ──data──▶ Edge AI ──result──▶ Action

Edge AI advantages:            Cloud AI drawbacks:
• Low latency (< 50 ms)        • Latency (100 ms+)
• Works offline                • Network dependency
• Privacy (data stays local)   • Data transfer cost
• Lower operating cost
```
### 1.2 Edge AI Use Cases

| Domain | Example Applications |
|---|---|
| Smart home | Face-recognition door lock, voice recognition |
| Manufacturing | Defect detection, predictive maintenance |
| Healthcare | Wearable health monitoring |
| Agriculture | Crop disease detection, pest identification |
| Automotive | ADAS, pedestrian detection |
### 1.3 Edge AI Framework Comparison

| Framework | Developer | Characteristics | Hardware Support |
|---|---|---|---|
| TensorFlow Lite | Google | General-purpose, rich ecosystem | CPU, GPU, Edge TPU |
| ONNX Runtime | Microsoft | Compatible with many frameworks | CPU, GPU |
| OpenVINO | Intel | Optimized for Intel hardware | Intel CPU/GPU |
| TensorRT | NVIDIA | Optimized for NVIDIA GPUs | NVIDIA GPU |
## 2. TensorFlow Lite Overview

### 2.1 TFLite Features

```python
# Key TensorFlow Lite characteristics
tflite_features = {
    "Lightweight": "Smaller models (down to ~1/4 size with quantization)",
    "Optimized": "Inference tuned for mobile/embedded devices",
    "Hardware acceleration": "GPU, Edge TPU, and DSP support",
    "Cross-platform": "Android, iOS, Linux, microcontrollers",
    "Operators": "Supports a subset of TensorFlow ops",
}
```
2.2 ๋ผ์ฆ๋ฒ ๋ฆฌํ์ด์์ ์ค์น¶
# ๋ฐฉ๋ฒ 1: tflite-runtime (๊ถ์ฅ, ๊ฒฝ๋)
pip install tflite-runtime
# ๋ฐฉ๋ฒ 2: ์ ์ฒด TensorFlow (๋ฌด๊ฑฐ์)
# pip install tensorflow
# ์ถ๊ฐ ํจํค์ง
pip install numpy pillow
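After installing, it can be useful to verify which interpreter provider is importable before writing any inference code. A minimal sketch (the helper name `check_tflite_backend` is ours, not part of any library):

```python
import importlib.util

def check_tflite_backend():
    """Return the first importable TFLite interpreter provider, or None."""
    # tflite-runtime is preferred on the Pi; full TensorFlow also ships an interpreter
    for module in ("tflite_runtime", "tensorflow"):
        if importlib.util.find_spec(module) is not None:
            return module
    return None

if __name__ == "__main__":
    backend = check_tflite_backend()
    print(f"TFLite backend: {backend or 'not installed'}")
```

The same try-import pattern appears in the inference scripts later in this chapter.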
### 2.3 TFLite Workflow

```
1. Train model (PC / cloud)   →  TensorFlow/Keras model (.h5)
2. Convert model              →  TFLite Converter (quantization options)
3. Optimized model            →  model.tflite (lightweight)
4. Deploy to edge             →  TFLite Runtime (Raspberry Pi)
```
## 3. Model Conversion (.tflite)

### 3.1 Basic Conversion

```python
#!/usr/bin/env python3
"""Convert a TensorFlow model to TFLite."""
import tensorflow as tf

# Load an existing Keras model
model = tf.keras.models.load_model('my_model.h5')

# Create the converter
converter = tf.lite.TFLiteConverter.from_keras_model(model)

# Convert
tflite_model = converter.convert()

# Save
with open('model.tflite', 'wb') as f:
    f.write(tflite_model)

print(f"Model size: {len(tflite_model) / 1024:.2f} KB")
```
3.2 ์์ํ (Quantization)¶
#!/usr/bin/env python3
"""์์ํ๋ฅผ ํตํ ๋ชจ๋ธ ์ต์ ํ"""
import tensorflow as tf
import numpy as np
def load_model():
return tf.keras.models.load_model('my_model.h5')
def convert_to_tflite(model, quantization='none'):
"""
์์ํ ์ต์
:
- 'none': ๊ธฐ๋ณธ (float32)
- 'dynamic': ๋์ ๋ฒ์ ์์ํ (๊ฐ์ค์น๋ง)
- 'float16': Float16 ์์ํ
- 'int8': ์ ์ฒด ์ ์ ์์ํ (๋ํ ๋ฐ์ดํฐ์
ํ์)
"""
converter = tf.lite.TFLiteConverter.from_keras_model(model)
if quantization == 'dynamic':
# ๋์ ๋ฒ์ ์์ํ (๊ฐ์ฅ ์ฌ์)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
elif quantization == 'float16':
# Float16 ์์ํ
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.target_spec.supported_types = [tf.float16]
elif quantization == 'int8':
# ์ ์ฒด ์ ์ ์์ํ (์ต๋ ์ต์ ํ)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.target_spec.supported_ops = [
tf.lite.OpsSet.TFLITE_BUILTINS_INT8
]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8
# ๋ํ ๋ฐ์ดํฐ์
์ ๊ณต ํ์
def representative_dataset():
for _ in range(100):
yield [np.random.rand(1, 224, 224, 3).astype(np.float32)]
converter.representative_dataset = representative_dataset
tflite_model = converter.convert()
return tflite_model
# ๋ณํ ๋ฐ ํฌ๊ธฐ ๋น๊ต
model = load_model()
for quant in ['none', 'dynamic', 'float16']:
tflite_model = convert_to_tflite(model, quant)
size_kb = len(tflite_model) / 1024
with open(f'model_{quant}.tflite', 'wb') as f:
f.write(tflite_model)
print(f"{quant}: {size_kb:.2f} KB")
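To see why int8 quantization shrinks a model to roughly a quarter of its float32 size with only a small accuracy cost, it helps to look at the underlying affine mapping (real value ≈ scale × (q − zero_point)). Below is a self-contained NumPy sketch of that idea, not TFLite's actual implementation:

```python
import numpy as np

def quantize_int8(x: np.ndarray):
    """Affine-quantize a float array to int8: x ≈ scale * (q - zero_point)."""
    qmin, qmax = -128, 127
    scale = (x.max() - x.min()) / (qmax - qmin)
    zero_point = int(np.clip(round(qmin - x.min() / scale), qmin, qmax))
    q = np.clip(np.round(x / scale) + zero_point, qmin, qmax).astype(np.int8)
    return q, scale, zero_point

def dequantize(q: np.ndarray, scale: float, zero_point: int) -> np.ndarray:
    """Recover approximate float values from the int8 codes."""
    return (q.astype(np.float32) - zero_point) * scale

if __name__ == "__main__":
    weights = np.linspace(-1.0, 1.0, 9, dtype=np.float32)
    q, scale, zp = quantize_int8(weights)
    error = np.abs(dequantize(q, scale, zp) - weights).max()
    print(f"max round-trip error: {error:.5f} (scale = {scale:.5f})")
```

Each weight is stored in 1 byte instead of 4, and the round-trip error is bounded by about one quantization step (the scale), which is why calibration with a representative dataset matters: it fixes the min/max range the scale is derived from.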
### 3.3 Converting a Pretrained Model

```python
#!/usr/bin/env python3
"""Convert MobileNet to TFLite."""
import tensorflow as tf

# Load MobileNetV2
model = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3),
    weights='imagenet',
    include_top=True
)

# Convert with default optimizations
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

with open('mobilenet_v2.tflite', 'wb') as f:
    f.write(tflite_model)

print(f"Conversion done: {len(tflite_model) / (1024*1024):.2f} MB")
```
## 4. Inference on Raspberry Pi

### 4.1 TFLite Interpreter Basics

```python
#!/usr/bin/env python3
"""TFLite inference basics."""
import numpy as np

# Use tflite-runtime on the Raspberry Pi; fall back to full TensorFlow
try:
    from tflite_runtime.interpreter import Interpreter
except ImportError:
    from tensorflow.lite.python.interpreter import Interpreter

class TFLiteModel:
    """Thin wrapper around a TFLite model."""

    def __init__(self, model_path: str):
        self.interpreter = Interpreter(model_path=model_path)
        self.interpreter.allocate_tensors()

        # Input/output metadata
        self.input_details = self.interpreter.get_input_details()
        self.output_details = self.interpreter.get_output_details()

        # Input shape and dtype
        self.input_shape = self.input_details[0]['shape']
        self.input_dtype = self.input_details[0]['dtype']

    def get_input_shape(self):
        """Return the expected input shape."""
        return self.input_shape

    def predict(self, input_data: np.ndarray) -> np.ndarray:
        """Run inference on one input."""
        # Set the input tensor
        self.interpreter.set_tensor(
            self.input_details[0]['index'],
            input_data.astype(self.input_dtype)
        )
        # Run
        self.interpreter.invoke()
        # Fetch the output tensor
        return self.interpreter.get_tensor(self.output_details[0]['index'])

# Usage example
if __name__ == "__main__":
    model = TFLiteModel("model.tflite")
    print(f"Input shape: {model.get_input_shape()}")

    # Dummy input
    input_data = np.random.rand(*model.get_input_shape()).astype(np.float32)
    output = model.predict(input_data)
    print(f"Output shape: {output.shape}")
```
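Depending on how the model was exported, `predict()` may return raw logits rather than probabilities. A small, model-agnostic post-processing sketch (the helper names `softmax` and `top_k` are ours):

```python
import numpy as np

def softmax(logits: np.ndarray) -> np.ndarray:
    """Convert logits to probabilities; subtracting the max avoids overflow."""
    e = np.exp(logits - np.max(logits))
    return e / e.sum()

def top_k(probs: np.ndarray, k: int = 5):
    """Return the k (class_index, score) pairs with the highest scores, descending."""
    idx = np.argsort(probs)[-k:][::-1]
    return [(int(i), float(probs[i])) for i in idx]

if __name__ == "__main__":
    logits = np.array([2.0, 1.0, 0.1, 3.0], dtype=np.float32)
    probs = softmax(logits)
    print(top_k(probs, k=2))  # class 3 ranks first, then class 0
```

The image classifier in section 5 uses the same argsort-and-reverse idiom to pick its top-k classes.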
### 4.2 Performance Measurement

```python
#!/usr/bin/env python3
"""Measure TFLite inference performance."""
import numpy as np
import time

try:
    from tflite_runtime.interpreter import Interpreter
except ImportError:
    from tensorflow.lite.python.interpreter import Interpreter

def benchmark_model(model_path: str, num_runs: int = 100):
    """Benchmark a model's inference speed."""
    interpreter = Interpreter(model_path=model_path)
    interpreter.allocate_tensors()

    input_details = interpreter.get_input_details()
    output_details = interpreter.get_output_details()
    input_shape = input_details[0]['shape']
    input_dtype = input_details[0]['dtype']

    # Warm-up run
    dummy_input = np.random.rand(*input_shape).astype(input_dtype)
    interpreter.set_tensor(input_details[0]['index'], dummy_input)
    interpreter.invoke()

    # Benchmark
    times = []
    for _ in range(num_runs):
        start = time.perf_counter()
        interpreter.set_tensor(input_details[0]['index'], dummy_input)
        interpreter.invoke()
        _ = interpreter.get_tensor(output_details[0]['index'])
        end = time.perf_counter()
        times.append((end - start) * 1000)  # ms

    avg_time = np.mean(times)
    std_time = np.std(times)
    fps = 1000 / avg_time

    print(f"=== {model_path} ===")
    print(f"Average inference time: {avg_time:.2f} ms (+/- {std_time:.2f})")
    print(f"FPS: {fps:.1f}")
    print(f"Input shape: {input_shape}")
    return avg_time

# Compare several models
if __name__ == "__main__":
    models = [
        "model_none.tflite",
        "model_dynamic.tflite",
        "model_float16.tflite"
    ]
    for model in models:
        try:
            benchmark_model(model)
            print()
        except Exception as e:
            print(f"{model}: error - {e}")
```
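Average latency hides tail behavior, which often matters more for real-time edge workloads than the mean. As an optional extension (not in the benchmark above), the same `times` list can be summarized with percentiles:

```python
import numpy as np

def latency_report(times_ms):
    """Summarize per-inference latencies (ms) with mean and tail percentiles."""
    t = np.asarray(times_ms, dtype=np.float64)
    return {
        "mean_ms": float(t.mean()),
        "p50_ms": float(np.percentile(t, 50)),
        "p95_ms": float(np.percentile(t, 95)),
        "p99_ms": float(np.percentile(t, 99)),
        "fps": float(1000.0 / t.mean()),
    }

if __name__ == "__main__":
    # Synthetic latencies: mostly ~20 ms with an occasional slow frame
    times = [20.0] * 95 + [60.0] * 5
    print(latency_report(times))
```

A p99 far above the median usually points to thermal throttling or background load on the Pi, something the average alone will not reveal.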
## 5. Image Classification Example

### 5.1 ImageNet Classification

```python
#!/usr/bin/env python3
"""TFLite image classification (MobileNet)."""
import numpy as np
from PIL import Image
import time

try:
    from tflite_runtime.interpreter import Interpreter
except ImportError:
    from tensorflow.lite.python.interpreter import Interpreter

class ImageClassifier:
    """TFLite image classifier."""

    def __init__(self, model_path: str, labels_path: str = None):
        self.interpreter = Interpreter(model_path=model_path)
        self.interpreter.allocate_tensors()
        self.input_details = self.interpreter.get_input_details()
        self.output_details = self.interpreter.get_output_details()

        # Expected input size
        self.input_height = self.input_details[0]['shape'][1]
        self.input_width = self.input_details[0]['shape'][2]

        # Load labels
        self.labels = []
        if labels_path:
            with open(labels_path, 'r') as f:
                self.labels = [line.strip() for line in f.readlines()]

    def preprocess(self, image: Image.Image) -> np.ndarray:
        """Preprocess an image for the model."""
        # Resize to the model's input size
        image = image.resize((self.input_width, self.input_height))
        # Convert to a NumPy array
        input_data = np.array(image, dtype=np.float32)
        # Normalize to [-1, 1]
        input_data = (input_data - 127.5) / 127.5
        # Add the batch dimension
        return np.expand_dims(input_data, axis=0)

    def classify(self, image_path: str, top_k: int = 5) -> dict:
        """Classify an image and return the top-k results."""
        # Load and preprocess the image
        image = Image.open(image_path).convert('RGB')
        input_data = self.preprocess(image)

        # Inference
        start = time.perf_counter()
        self.interpreter.set_tensor(
            self.input_details[0]['index'],
            input_data
        )
        self.interpreter.invoke()
        output = self.interpreter.get_tensor(
            self.output_details[0]['index']
        )[0]
        inference_time = (time.perf_counter() - start) * 1000

        # Top-k results
        top_indices = output.argsort()[-top_k:][::-1]
        results = []
        for idx in top_indices:
            label = self.labels[idx] if idx < len(self.labels) else f"class_{idx}"
            results.append({
                "class_id": int(idx),
                "label": label,
                "score": float(output[idx])
            })

        return {
            "results": results,
            "inference_time_ms": inference_time
        }

# Usage example
if __name__ == "__main__":
    classifier = ImageClassifier(
        model_path="mobilenet_v2.tflite",
        labels_path="imagenet_labels.txt"
    )
    result = classifier.classify("test_image.jpg")
    print(f"Inference time: {result['inference_time_ms']:.2f} ms")
    print("\nClassification results:")
    for r in result['results']:
        print(f"  {r['label']}: {r['score']:.4f}")
```
### 5.2 Real-Time Camera Classification

```python
#!/usr/bin/env python3
"""Real-time image classification with the Pi Camera."""
import numpy as np
from PIL import Image
import time

try:
    from tflite_runtime.interpreter import Interpreter
except ImportError:
    from tensorflow.lite.python.interpreter import Interpreter

try:
    from picamera2 import Picamera2
    HAS_CAMERA = True
except ImportError:
    HAS_CAMERA = False
    print("picamera2 not available: simulation mode")

class RealtimeClassifier:
    """Real-time image classifier."""

    def __init__(self, model_path: str, labels_path: str):
        # Load the model
        self.interpreter = Interpreter(model_path=model_path)
        self.interpreter.allocate_tensors()
        self.input_details = self.interpreter.get_input_details()
        self.output_details = self.interpreter.get_output_details()
        self.input_height = self.input_details[0]['shape'][1]
        self.input_width = self.input_details[0]['shape'][2]

        # Labels
        with open(labels_path, 'r') as f:
            self.labels = [line.strip() for line in f.readlines()]

        # Initialize the camera
        if HAS_CAMERA:
            self.camera = Picamera2()
            config = self.camera.create_preview_configuration(
                main={"size": (640, 480), "format": "RGB888"}
            )
            self.camera.configure(config)

    def preprocess(self, frame: np.ndarray) -> np.ndarray:
        """Preprocess a camera frame."""
        image = Image.fromarray(frame)
        image = image.resize((self.input_width, self.input_height))
        input_data = np.array(image, dtype=np.float32)
        input_data = (input_data - 127.5) / 127.5
        return np.expand_dims(input_data, axis=0)

    def classify_frame(self, frame: np.ndarray) -> dict:
        """Classify a single frame."""
        input_data = self.preprocess(frame)
        self.interpreter.set_tensor(
            self.input_details[0]['index'],
            input_data
        )
        self.interpreter.invoke()
        output = self.interpreter.get_tensor(
            self.output_details[0]['index']
        )[0]
        top_idx = output.argmax()
        return {
            "label": self.labels[top_idx] if top_idx < len(self.labels) else "unknown",
            "score": float(output[top_idx])
        }

    def run(self, duration: float = 30):
        """Run real-time classification for `duration` seconds."""
        if not HAS_CAMERA:
            print("No camera available")
            return

        self.camera.start()
        print(f"Starting real-time classification ({duration} s)")
        start_time = time.time()
        frame_count = 0
        try:
            while time.time() - start_time < duration:
                frame = self.camera.capture_array()
                result = self.classify_frame(frame)
                frame_count += 1
                print(f"\r[{frame_count}] {result['label']}: {result['score']:.2f}", end="")
        except KeyboardInterrupt:
            pass
        finally:
            self.camera.stop()
            elapsed = time.time() - start_time
            fps = frame_count / elapsed
            print(f"\n\nFPS: {fps:.1f}")

if __name__ == "__main__":
    classifier = RealtimeClassifier(
        model_path="mobilenet_v2.tflite",
        labels_path="imagenet_labels.txt"
    )
    classifier.run(duration=60)
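Frame-by-frame predictions tend to flicker when the top two classes are close. One common stabilization trick (an addition of ours, not part of the classifier above) is an exponential moving average over the score vector, reporting a label only once its smoothed score clears a threshold:

```python
import numpy as np

class ScoreSmoother:
    """Exponential moving average over per-class scores to reduce flicker."""

    def __init__(self, alpha: float = 0.3, threshold: float = 0.5):
        self.alpha = alpha          # weight given to the newest frame
        self.threshold = threshold  # minimum smoothed score to report a label
        self.ema = None

    def update(self, scores: np.ndarray):
        """Blend in a new score vector; return (class_index, score) or None."""
        scores = np.asarray(scores, dtype=np.float32)
        if self.ema is None:
            self.ema = scores.copy()
        else:
            self.ema = self.alpha * scores + (1 - self.alpha) * self.ema
        idx = int(self.ema.argmax())
        score = float(self.ema[idx])
        return (idx, score) if score >= self.threshold else None

if __name__ == "__main__":
    smoother = ScoreSmoother(alpha=0.5, threshold=0.6)
    for frame_scores in ([0.9, 0.1], [0.2, 0.8], [0.9, 0.1]):
        print(smoother.update(np.array(frame_scores)))
```

In the real-time loop, `smoother.update(output)` would sit between `classify_frame()`'s raw output and the printout, so a single noisy frame no longer changes the displayed label.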
### 5.3 IoT Integration Example

```python
#!/usr/bin/env python3
"""TFLite + MQTT: publish image classification results."""
import numpy as np
from PIL import Image
import json
import time
import paho.mqtt.client as mqtt

try:
    from tflite_runtime.interpreter import Interpreter
except ImportError:
    from tensorflow.lite.python.interpreter import Interpreter

class AIEdgeNode:
    """AI edge node."""

    def __init__(self, model_path: str, labels_path: str,
                 mqtt_broker: str = "localhost"):
        # TFLite model
        self.interpreter = Interpreter(model_path=model_path)
        self.interpreter.allocate_tensors()
        self.input_details = self.interpreter.get_input_details()
        self.output_details = self.interpreter.get_output_details()
        self.input_shape = self.input_details[0]['shape']

        with open(labels_path, 'r') as f:
            self.labels = [line.strip() for line in f.readlines()]

        # MQTT client
        self.mqtt_client = mqtt.Client()
        self.mqtt_client.connect(mqtt_broker, 1883)
        self.mqtt_client.loop_start()
        self.node_id = "edge_ai_01"

    def classify_and_publish(self, image_path: str):
        """Classify an image and publish the result over MQTT."""
        # Load and preprocess the image
        image = Image.open(image_path).convert('RGB')
        image = image.resize((self.input_shape[2], self.input_shape[1]))
        input_data = np.array(image, dtype=np.float32)
        input_data = (input_data - 127.5) / 127.5
        input_data = np.expand_dims(input_data, axis=0)

        # Inference
        start = time.perf_counter()
        self.interpreter.set_tensor(self.input_details[0]['index'], input_data)
        self.interpreter.invoke()
        output = self.interpreter.get_tensor(self.output_details[0]['index'])[0]
        inference_time = (time.perf_counter() - start) * 1000

        # Build the result (top 3)
        top_idx = output.argsort()[-3:][::-1]
        predictions = [
            {
                "label": self.labels[idx] if idx < len(self.labels) else "unknown",
                "score": float(output[idx])
            }
            for idx in top_idx
        ]
        result = {
            "node_id": self.node_id,
            "image": image_path,
            "predictions": predictions,
            "inference_time_ms": round(inference_time, 2),
            "timestamp": time.time()
        }

        # Publish over MQTT
        topic = f"edge/{self.node_id}/classification"
        self.mqtt_client.publish(topic, json.dumps(result))
        print(f"Published: {topic}")
        print(f"  Top-1: {predictions[0]['label']} ({predictions[0]['score']:.2f})")
        return result

    def shutdown(self):
        """Clean up."""
        self.mqtt_client.loop_stop()
        self.mqtt_client.disconnect()

# Usage example
if __name__ == "__main__":
    node = AIEdgeNode(
        model_path="mobilenet_v2.tflite",
        labels_path="imagenet_labels.txt",
        mqtt_broker="localhost"
    )
    try:
        # Classify a test image
        node.classify_and_publish("test_image.jpg")
    finally:
        node.shutdown()
```
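On the receiving side (a dashboard or gateway), a paho-mqtt subscriber can consume these messages. A sketch under the same assumptions as above (broker on `localhost`, topic pattern `edge/+/classification`); the parsing step is separated into its own function so it can be exercised without a broker:

```python
import json

def parse_classification(payload: bytes) -> str:
    """Turn one published JSON message into a one-line summary."""
    data = json.loads(payload)
    top = data["predictions"][0]
    return f"[{data['node_id']}] {top['label']} ({top['score']:.2f})"

def run_subscriber(broker: str = "localhost"):
    """Subscribe to every edge node's classification topic (needs a broker)."""
    import paho.mqtt.client as mqtt

    def on_message(client, userdata, msg):
        print(parse_classification(msg.payload))

    client = mqtt.Client()
    client.on_message = on_message
    client.connect(broker, 1883)
    client.subscribe("edge/+/classification")  # '+' matches any node_id
    client.loop_forever()

if __name__ == "__main__":
    # Offline check of the parser with a sample payload
    sample = json.dumps({
        "node_id": "edge_ai_01",
        "predictions": [{"label": "tabby", "score": 0.87}],
    }).encode()
    print(parse_classification(sample))
```

The `+` wildcard in the topic filter means one subscriber scales to any number of edge nodes without configuration changes.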
## Practice Problems

### Problem 1: Model Conversion

- Convert a Keras model to TFLite.
- Apply dynamic-range quantization and compare the file sizes.

### Problem 2: Performance Optimization

- Compare the performance of FP32, FP16, and INT8 versions of the same model.
- Measure FPS on a Raspberry Pi.

### Problem 3: Real-Time Classification

- Implement real-time image classification with the Pi Camera.
- Publish the classification results over MQTT.
## Next Steps

- 09_Edge_AI_ONNX.md: Edge AI with ONNX Runtime
- 11_Image_Analysis_Project.md: Video analysis project

Last updated: 2026-02-01