Klepsydra AI Inference Add-On

High-performance deep neural network (DNN) inference engine for spacecraft computers, enabling deployment of any AI or machine learning model on any type of processor, including CPU-only systems.

Klepsydra in-the-Loop

Allows hardware performance to be included in model training: performance parameters such as power consumption, latency and RAM usage become part of the training data and can be optimised for, adding another dimension to AI@Edge training (see the sketch after this section).

Requires the Klepsydra High-Performance Software Framework to run
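To illustrate the idea of "hardware in the loop" training, the sketch below folds measured on-target figures into the training objective so that model selection trades accuracy against hardware cost. It is a minimal, generic Python illustration, not the Klepsydra API: the function and parameter names (measure_on_target, validation_error, the budgets and weights) are assumptions, and the measurements are simulated rather than taken from a flight processor.

```python
# Hypothetical illustration of hardware-in-the-loop training: measured
# performance metrics (latency, power) are folded into the objective so
# that model selection trades accuracy against on-target cost.
# measure_on_target() is a stand-in for whatever mechanism reports
# metrics from the target processor; here it is simulated.

import random

def measure_on_target(model_width):
    """Stand-in for a real on-hardware measurement (latency in ms, power in W)."""
    latency_ms = 0.05 * model_width + random.uniform(0.0, 0.5)
    power_w = 0.8 + 0.002 * model_width
    return latency_ms, power_w

def validation_error(model_width):
    """Stand-in for the usual accuracy-driven part of the objective (toy model)."""
    return 1.0 / (1.0 + 0.01 * model_width)  # bigger models fit better here

def combined_objective(model_width, latency_budget_ms=5.0, power_budget_w=1.5):
    """Scalar objective mixing validation error with measured hardware cost."""
    err = validation_error(model_width)
    latency_ms, power_w = measure_on_target(model_width)
    # Penalise exceeding the latency / power budgets (weights are arbitrary).
    penalty = max(0.0, latency_ms - latency_budget_ms) \
              + 2.0 * max(0.0, power_w - power_budget_w)
    return err + 0.1 * penalty, latency_ms, power_w

if __name__ == "__main__":
    best = None
    for width in (32, 64, 128, 256, 512):
        score, lat, pw = combined_objective(width)
        print(f"width={width:4d}  score={score:.3f}  "
              f"latency={lat:.2f} ms  power={pw:.2f} W")
        if best is None or score < best[1]:
            best = (width, score)
    print("selected width:", best[0])
```

In this toy setup the candidate that minimises the combined score is selected, so a larger, more accurate network can lose out to a smaller one that stays within the latency and power budgets.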

Specifications
Lead Time: 2 to 3 weeks
Operating Systems: Linux, FreeRTOS, PikeOS
Supported Processors: ARMv8, ARM Cortex A7, ARM Cortex A9
Minimum RAM Usage: 250 MB
Disk Space Usage: 10 MB
Supported Languages: C++, C, NodeJS, Python
Supported Model Formats: ONNX, TensorFlow, Caffe (a minimal export sketch follows the specifications)
Supported Data Formats: Float matrices, time series, OpenCV matrix objects (cvMat)
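As a minimal sketch of producing a model in one of the listed formats, the example below exports a toy PyTorch network to ONNX. PyTorch, the toy network and the output filename "model.onnx" are assumptions for illustration only; any framework able to emit ONNX, TensorFlow or Caffe models would serve equally well.

```python
# Hypothetical example: exporting a trained PyTorch network to ONNX,
# one of the supported model formats listed above.

import torch
import torch.nn as nn

# A toy network standing in for the model to be deployed.
model = nn.Sequential(
    nn.Linear(16, 32),
    nn.ReLU(),
    nn.Linear(32, 4),
)
model.eval()

# Example input matching the float-matrix data format listed above.
dummy_input = torch.randn(1, 16)

# Serialise to ONNX; the resulting file is what would be handed to the
# inference engine on the target processor.
torch.onnx.export(model, dummy_input, "model.onnx", opset_version=13)
```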
Documents