MODULAR ROBOTICS INFRASTRUCTURE

Aliensense RED

Meet the platform that transforms how your robots gain new capabilities.
Add compute, sensors, and AI-acceleration through plug-and-play modules.
No redesigns, only fast iterations.
ALIENSENSE RED
COMPUTE BASE
Jetson Orin Nano/NX carrier module with M.2/PCIe expansion for swappable adapter cards, plus CAN connectors for your industrial robotics needs.
Built for the robots that actually ship.
Compute that grows with your design.
Vibe Engineering Done Right
Iterate Fast
M.2 and PCIe expansion let you add accelerators, storage, or I/O cards—upgrade compute or add features without rebuilding your boards.
Reliable Sync Across the Board
Fusion That Works
Hardware PPS timebase aligns timestamps across cameras, IMUs and LiDAR so sensor fusion works reliably out of the box.
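The idea behind a shared PPS timebase can be sketched in a few lines. This is a conceptual illustration only, with hypothetical function names rather than the RED SDK API: each sensor's local clock is disciplined against the shared pulse-per-second edge, so samples from different devices land on one common timeline.

```python
# Conceptual sketch of PPS-based timestamp alignment.
# Function names and values are illustrative, not the RED SDK API.

def pps_offset(local_pps_time: float, true_pps_time: float) -> float:
    """Offset of a sensor's local clock from the shared PPS timebase."""
    return true_pps_time - local_pps_time

def align(sample_time: float, offset: float) -> float:
    """Map a locally-timestamped sample onto the shared timebase."""
    return sample_time + offset

# The camera's clock saw the last PPS edge at 12.0003 s; the true edge was 12.0 s.
cam_offset = pps_offset(12.0003, 12.0)
imu_offset = pps_offset(11.9998, 12.0)

cam_t = align(12.0503, cam_offset)  # camera frame timestamp
imu_t = align(12.0498, imu_offset)  # IMU sample timestamp
# Both samples now sit on the same timebase and can be fused directly.
```

With every device correcting against the same hardware pulse, fusion code never has to guess which clock a timestamp came from.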
Supercharge the Compute Speeds
Unleash the Compute
A dedicated MCU handles CAN, GPIO, and high-rate IMU polling with sub‑millisecond timing, keeping the Jetson focused on perception and planning.
Placement Freedom for Your Designs
Real-Time Anywhere
Mount sensors on arms, corners or gimbals without signal loss. EMI‑resistant GMSL2 carries data, power and sync up to 15 m with low latency.
ALIENSENSE RED
SENSOR ECOSYSTEM
Connect all your cameras, positioning devices and sensors with pre-integrated ROS-native support. Start collecting data in hours instead of engineering custom drivers.
Thalamus A adds 2× GMSL2 camera ports plus dual CAN-FD buses to your RED carrier without redesigning PCBs. A single module to enable synchronized stereo vision, robot bus communication, and PPS time domain sync.
Thalamus B slots underneath your RED carrier with 4× GMSL2 camera ports, enabling panoramic perception, redundant sensor setups, or high-resolution multi-camera arrays. Bottom-mounted design keeps M.2 slots accessible, so you can run perception and AI acceleration simultaneously.
Plug in stereo cameras for VSLAM, panoramic arrays for 360° perception, CAN buses for robot control, and M.2 AI accelerators for edge inference. All components share a unified PPS time domain and zero-copy ROS2 data paths. Sensor topologies are configured via YAML.
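A YAML topology for a setup like this might look as follows. The schema below is purely illustrative, assumed for the sake of example rather than taken from the shipped RED configuration format:

```yaml
# Illustrative sketch only — the actual RED YAML schema may differ.
sensors:
  front_stereo:
    type: gmsl2_camera
    ports: [cam0, cam1]
    resolution: 4k
    fps: 30
    sync: pps          # slaved to the shared PPS timebase
  base_imu:
    type: mikrobus_imu
    rate_hz: 1000
    bus: mcu           # polled by the dedicated MCU, not the Jetson
can:
  drive_bus:
    interface: canfd0
    bitrate: 1000000
ros2:
  transport: zero_copy
  namespace: /red
```

The point is that swapping a sensor means editing a few lines of configuration, not writing a driver.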
From unboxing to streaming in hours, not weeks.
Sense everything. Configure nothing.
Extended Camera Connectivity
Have a Look Around
Process 4K30 streams from IMX334/678/900 sensors on Orin Nano. Ideal for situational awareness kits and 360° sensing.
Hundreds of Devices with mikroBUS
Less Downtime
mikroBUS Click Boards mount camera-side for IMU, GPS, pressure, and compass sensing. All sensor data tunnels through the same cable as the video.
More Power Even for Starter Kits
Supercharge the Edge
Video flows straight from sensor to perception code with zero memory copies. Orin Nano handles 8 cameras at <5% CPU utilization.
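The zero-copy idea can be shown with a stdlib analogy. This is not the RED driver API, just an illustration: a `memoryview` exposes the frame buffer to a consumer without duplicating the underlying bytes.

```python
# Stdlib analogy for zero-copy frame handoff (not the RED driver API):
# a memoryview exposes the frame buffer without duplicating bytes.
frame_buffer = bytearray(3840 * 2160 * 2)  # one 4K YUV422 frame

# "Perception" gets a view into the buffer; no copy is made.
view = memoryview(frame_buffer)
roi = view[0:16]

frame_buffer[0] = 255  # writes to the buffer are visible through the view
```

Real zero-copy transports work the same way in spirit: producer and consumer share one buffer, and only references move.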
Future-proof and Coder-friendly
ROS as a Boss
Data is timestamped and broadcast to ROS2. The microcontroller handles time-critical sensors: IMU polling, CAN bus, and GPIO events.
ALIENSENSE RED
AI ACCELERATION
Run vision models, transformers, and decision networks at the edge with ML-optimized silicon designed for autonomous perception.
Run concurrent inference and sensor fusion without offloading.
Working at the edge.
Parallel Multi-Network Execution
Your Multitasker
Execute detection, tracking, classification, and prediction networks simultaneously. Fuse multiple outputs without bottlenecking or throttling.
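Conceptually, parallel multi-network execution looks like the sketch below. The "networks" here are placeholder functions standing in for real models, and the thread pool stands in for the accelerator's concurrent execution units; none of this is the RED inference API.

```python
# Conceptual sketch: independent networks run concurrently on one
# frame, and their outputs are fused once all of them finish.
# The model functions are placeholders, not real networks.
from concurrent.futures import ThreadPoolExecutor

def detect(frame):   return {"boxes": [(10, 20, 50, 60)]}
def track(frame):    return {"track_ids": [7]}
def classify(frame): return {"label": "forklift"}

def fuse(results):
    """Merge per-network outputs into one perception message."""
    merged = {}
    for r in results:
        merged.update(r)
    return merged

frame = object()  # stand-in for a camera frame
with ThreadPoolExecutor(max_workers=3) as pool:
    futures = [pool.submit(net, frame) for net in (detect, track, classify)]
    perception = fuse(f.result() for f in futures)
```

The fused dictionary carries detections, track IDs, and class labels for the same frame, with no network waiting in line behind another.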
Power-Efficient Sensor Capabilities
Longer Autonomy
Run more sensor channels on the same power budget or extend battery life without sacrificing perception performance.
Robotics-first Thermal Management
Be Cool. Be Fanless.
Deploy in sealed enclosures without thermal throttling. No aggressive cooling, no design compromises: just inference in constrained spaces.
No Edge Inference Engine Lock-in
No lock-ins. Ever.
Framework-agnostic toolchain runs ONNX models without vendor dependencies. Diversify silicon and protect your existing investment.