
Sight Beyond Sight
Xikaku builds the sensors and software that let autonomous systems understand where they are, how they are moving, and what is around them.
From high-precision IMUs to real-time sensor fusion, our technology powers autonomous vehicles, robotics, AR/VR, industrial motion capture, and maritime platforms around the world.
Explore our sensors
The Xikaku LPMS product line covers the full range of inertial measurement, from miniature OEM boards to rugged IP67 navigation-grade units with integrated GNSS. Click any product for full specifications.
- LPMS-B2 Series – 3D IMU / AHRS · Wireless Bluetooth 2.1 / BLE · Internal flash memory recording
- LPMS-U3 Series – 3D IMU / AHRS · USB, CAN Bus, RS232 or TTL · Compact plastic enclosure
- LPMS-AL3 Series – 3D IMU / AHRS · IP67 waterproof housing · CAN, RS232, TTL or USB
- LPMS-IG1 Series – 9-axis IMU / AHRS · Dual gyroscopes for range flexibility · USB / CAN / RS232 / RS485
- LPMS-IG1P Series – 9-axis IMU / AHRS · Integrated GPS and RTK-GPS · Dead-reckoning for localization
- LPMS-IG1W Series – 9-axis IMU / AHRS · Wi-Fi for industrial IoT · Dual gyroscopes, IP67
- LPMS-NAV3 Series – 6-axis IMU / AHRS · High-accuracy heading gyroscope · Automotive and AGV navigation
- LPMS-CURS3 Series – PCB-level 9-axis IMU / AHRS · USB, CAN and UART · OEM integration, 22 × 28 × 7.65 mm
- LPMS-INC1 Series – Dual-axis MEMS inclinometer · High-precision tilt measurement · RS232 / CAN, IP67 housing
See the full Sensors page for complete specifications and datasheets.
FusionHub - real-time sensor fusion

FusionHub is our sensor fusion engine for applications where a single sensor is never enough. It ingests data from IMUs, GNSS/RTK receivers, optical trackers, wheel odometry, CAN buses, and VR headsets, and fuses them through Unscented Kalman Filters tuned for real-world noise, dropouts, and latency.
Pipelines are built as node graphs in a browser-based editor, configured with JSON, and hot-reloaded without restarts. The same engine drives a tracked vehicle, a mocap stage, an AR headset, or an AGV fleet.
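To make the node-graph idea concrete, here is a minimal sketch of a four-node pipeline (IMU and GNSS sources feeding a fusion node and a pose output) serialized to JSON. The node types, parameter names, and layout are hypothetical placeholders for illustration, not FusionHub's actual configuration schema; see the FusionHub documentation for the real format.

```python
import json

# Illustrative only: node types, parameter names, and wiring below are
# hypothetical placeholders, NOT FusionHub's actual config schema.
pipeline = {
    "nodes": [
        {"id": "imu",    "type": "ImuSource",  "params": {"port": "/dev/ttyUSB0", "rate_hz": 400}},
        {"id": "gnss",   "type": "GnssSource", "params": {"mode": "rtk"}},
        {"id": "fusion", "type": "UkfFusion",  "inputs": ["imu", "gnss"]},
        {"id": "pose",   "type": "PoseOutput", "inputs": ["fusion"]},
    ]
}

# A browser-based node editor renders a graph like this visually; the JSON
# form is what gets stored, versioned, and hot-reloaded.
print(json.dumps(pipeline, indent=2))
```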
FusionHub is also the layer that connects AI to the physical world. Through a built-in MCP server, language models and agents can query live pose, motion, and navigation state, giving embodied AI, robotics, and autonomous systems the real-time ground truth they need to reason and act.
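As a sketch of what that integration looks like from the agent side, the example below uses the official MCP Python SDK to connect to an MCP server over stdio and call a pose-query tool. The launch command (`fusionhub --mcp`) and the tool name (`get_pose`) are assumed placeholders, not FusionHub's documented interface; the actual invocation and tool catalog are described on the FusionHub page.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Hypothetical server launch command: the real FusionHub MCP invocation
    # is documented on the FusionHub page.
    server = StdioServerParameters(command="fusionhub", args=["--mcp"])

    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover what the server actually exposes before calling anything.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

            # "get_pose" is an assumed tool name for illustration only.
            result = await session.call_tool("get_pose", arguments={})
            print(result.content)


asyncio.run(main())
```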
Key capabilities
- 6-DoF IMU + optical fusion with automatic intercalibration
- 20-state GNSS + IMU fusion with online bias and antenna-offset estimation
- Odometry + IMU dead-reckoning and full vehicular fusion for automotive workloads
- Visual node editor, live dashboard, 3D and map views, cloud workspace sync
- Embeddable via C FFI; runs on Windows and Linux, on the desk or headless at the edge
See the full FusionHub page for architecture details, supported sensors, and deployment examples.
Use cases
Our technology has been deployed in autonomous-driving trials, in-flight pilot training, consumer VR hardware, and industrial automation. Four recent examples:
AD / ADAS Vehicle-in-the-Loop
A physical vehicle drives a real test track while interacting with a dynamic virtual environment in real time. Built in collaboration with IPG Automotive, FusionHub synchronizes live vehicle dynamics with CarMaker scenarios so AD/ADAS systems can be validated safely and repeatably. Read more →
VR Pilot Training in a Helicopter
Canada’s National Research Council uses LPVR-DUO to train helicopter pilots for Ship Helicopter Operating Limitations (SHOL) in actual flight. The system keeps head tracking and VR overlays stable despite the extreme accelerations, vibrations, and magnetic disturbances of a live aircraft. Read more →
Sub-centimeter SLAM on Meta Quest 3
Our Full Fusion pipeline combines visual SLAM (ZED Mini stereo camera) with an LPMS-CURS3 IMU to deliver room-scale 6-DoF tracking on a Meta Quest 3. Validated against an ART Smarttrack 3 as ground truth, the pipeline achieves sub-centimeter position accuracy and 0.45° rotation error. Read more →
Wearable IMU motion capture
LPMOCAP captures full upper-body motion using only a small set of wearable IMUs (head, chest, and both arms), with no studio, markers, or external cameras. Updating at 50 Hz with sub-0.01° resolution, it is used for ergonomics analysis, injury rehabilitation, and strain prevention where optical mocap is impractical. It is a clear demonstration of how far inertial-only motion reconstruction can be pushed with the right sensor-fusion stack. Read more →
See the full Use cases page for more deployments.
Our story
Xikaku is an American company headquartered in Venice, California, and the commercial arm of LP-Research, Inc., our Tokyo-founded sensor and sensor-fusion R&D group. Together we turn two decades of motion-sensing research into products that ship worldwide across automotive, aerospace, robotics, industrial, and AR/VR.
Read the full Our story page: from a Tokyo robotics lab to LP-Research, LPVR, and Xikaku in Venice, California.
Contact
Xikaku, Inc.
4223 Glencoe Ave, Suite C215
Marina Del Rey, CA 90292
info@xikaku.com