Xikaku

Sight beyond Sight

Xikaku builds the sensors and software that let autonomous systems understand where they are, how they are moving, and what is around them.

From high-precision IMUs to real-time sensor fusion, our technology powers autonomous vehicles, robotics, AR/VR, industrial motion capture, and maritime platforms around the world.

Explore our sensors

The Xikaku LPMS product line covers the full range of inertial measurement — from miniature OEM boards to rugged IP67 navigation-grade units with integrated GNSS. Click any product for full specifications.

LPMS-B2 Series — 3D IMU / AHRS · Wireless Bluetooth 2.1 / BLE · Internal flash memory recording
LPMS-U3 Series — 3D IMU / AHRS · USB, CAN Bus, RS232 or TTL · Compact plastic enclosure
LPMS-AL3 Series — 3D IMU / AHRS · IP67 waterproof housing · CAN, RS232, TTL or USB
LPMS-IG1 Series — 9-axis IMU / AHRS · Dual gyroscopes for range flexibility · USB / CAN / RS232 / RS485
LPMS-IG1P Series — 9-axis IMU / AHRS · Integrated GPS and RTK-GPS · Dead-reckoning for localization
LPMS-IG1W Series — 9-axis IMU / AHRS · Wi-Fi for industrial IoT · Dual gyroscopes, IP67
LPMS-NAV3 Series — 6-axis IMU / AHRS · High-accuracy heading gyroscope · Automotive and AGV navigation
LPMS-CURS3 Series — PCB-level 9-axis IMU / AHRS · USB, CAN and UART · OEM integration, 22 × 28 × 7.65 mm
LPMS-INC1 Series — Dual-axis MEMS inclinometer · High-precision tilt measurement · RS232 / CAN, IP67 housing

See the full Sensors page for complete specifications and datasheets.

FusionHub - real-time sensor fusion

FusionHub is our sensor fusion engine for applications where a single sensor is never enough. It ingests data from IMUs, GNSS/RTK receivers, optical trackers, wheel odometry, CAN buses, and VR headsets, and fuses them through Unscented Kalman Filters tuned for real-world noise, dropouts, and latency.
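FusionHub's UKF implementation is not shown here, but the predict/update cycle that any Kalman-style fusion builds on can be illustrated with a deliberately simplified one-dimensional linear filter. The function name and noise values below are illustrative assumptions, not FusionHub code; the point is how IMU dead-reckoning grows uncertainty during a GNSS dropout and how a fix shrinks it again.

```python
def fuse_step(x, P, imu_dx, q, z=None, r=4.0):
    """One predict/update cycle of a 1-D linear Kalman filter.

    x, P   : position estimate and its variance
    imu_dx : displacement integrated from the IMU this step
    q      : process noise added by dead-reckoning drift
    z      : GNSS position fix, or None during a dropout
    r      : GNSS measurement variance
    """
    # Predict: dead-reckon with the IMU; uncertainty grows.
    x, P = x + imu_dx, P + q
    # Update: fold in a GNSS fix when one is available.
    if z is not None:
        K = P / (P + r)                       # Kalman gain
        x, P = x + K * (z - x), (1 - K) * P   # pull toward fix, shrink variance
    return x, P
```

During a dropout (`z=None`) only the predict step runs, so `P` grows each cycle; the next valid fix then receives a correspondingly larger Kalman gain.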

Pipelines are built as node graphs in a browser-based editor, configured with JSON, and hot-reloaded without restarts. The same engine drives a tracked vehicle, a mocap stage, an AR headset, or an AGV fleet.
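To make the node-graph idea concrete, here is a rough sketch of what such a JSON pipeline description and a minimal loader might look like. The node types, field names, and `load_pipeline` helper are invented for illustration and are not FusionHub's actual configuration schema.

```python
import json

# Hypothetical pipeline description in the spirit of a node-graph config.
# Node types and field names here are made up for this sketch.
PIPELINE = """
{
  "nodes": [
    {"id": "imu",    "type": "ImuSource",  "rate_hz": 400},
    {"id": "gnss",   "type": "GnssSource", "rate_hz": 10},
    {"id": "fusion", "type": "UkfFusion"},
    {"id": "out",    "type": "PoseOutput"}
  ],
  "edges": [
    ["imu", "fusion"],
    ["gnss", "fusion"],
    ["fusion", "out"]
  ]
}
"""

def load_pipeline(text):
    """Parse a pipeline graph and check every edge references a known node."""
    cfg = json.loads(text)
    ids = {n["id"] for n in cfg["nodes"]}
    for src, dst in cfg["edges"]:
        if src not in ids or dst not in ids:
            raise ValueError(f"edge {src}->{dst} references an unknown node")
    return cfg

cfg = load_pipeline(PIPELINE)
```

Because the graph is plain data, re-validating and re-parsing it on file change is what makes hot-reloading a pipeline without a restart straightforward.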

FusionHub is also the layer that connects AI to the physical world. Through a built-in MCP server, language models and agents can query live pose, motion, and navigation state — giving embodied AI, robotics, and autonomous systems the real-time ground truth they need to reason and act.
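MCP exposes server capabilities as tools invoked through JSON-RPC `tools/call` requests. The sketch below shows the shape such an exchange could take; the `get_pose` tool name and the response fields are hypothetical, not FusionHub's documented interface.

```python
import json

# Hypothetical MCP tool call an agent might issue to query live pose.
# "get_pose" and its argument/response fields are invented for this sketch.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {"name": "get_pose", "arguments": {"frame": "world"}},
}

# A response payload as the server might return it (values made up).
response_text = json.dumps({
    "position_m": [1.20, -0.35, 0.02],
    "orientation_quat_wxyz": [0.99, 0.0, 0.0, 0.14],
    "timestamp_us": 1718000000000000,
})

# The agent parses the payload and can reason over the numbers directly.
pose = json.loads(response_text)
x, y, z = pose["position_m"]
```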

See the full FusionHub page for architecture details, supported sensors, and deployment examples.

Use cases

Our technology has been deployed in autonomous-driving trials, in-flight pilot training, consumer VR hardware, and industrial automation. Four recent examples:

AD/ADAS Vehicle-in-the-Loop

A physical vehicle drives a real test track while interacting with a dynamic virtual environment in real time. Built in collaboration with IPG Automotive, FusionHub synchronizes live vehicle dynamics with CarMaker scenarios so AD/ADAS systems can be validated safely and repeatably. Read more →

VR Pilot Training in a Helicopter

Canada’s National Research Council uses LPVR-DUO to train helicopter pilots for Ship Helicopter Operating Limitations (SHOL) in actual flight. The system keeps head tracking and VR overlays stable despite the extreme accelerations, vibrations, and magnetic disturbances of a live aircraft. Read more →

Sub-centimeter SLAM on Meta Quest 3

Our Full Fusion pipeline combines visual SLAM (ZED Mini stereo camera) with an LPMS-CURS3 IMU to deliver room-scale 6-DoF tracking on a Meta Quest 3. Validated against an ART Smarttrack 3 as ground truth, the system achieves sub-centimeter position accuracy and 0.45° rotation error. Read more →

Wearable IMU motion capture

LPMOCAP captures full upper-body motion using only a small set of wearable IMUs — head, chest, and both arms — with no studio, markers, or external cameras. At 50 Hz updates with sub-0.01° resolution, it supports ergonomics analysis, injury rehabilitation, and strain prevention where optical mocap is impractical. A clear demonstration of how far inertial-only motion reconstruction can be pushed with the right sensor-fusion stack. Read more →
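LPMOCAP's reconstruction algorithm is not published here, but the basic building block of IMU-based mocap, expressing one segment's orientation relative to another to obtain a joint rotation, can be sketched with plain quaternion algebra. The frame names and helper functions below are illustrative, not LPMOCAP's actual code.

```python
def q_mul(a, b):
    """Hamilton product of two quaternions in (w, x, y, z) order."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (
        aw * bw - ax * bx - ay * by - az * bz,
        aw * bx + ax * bw + ay * bz - az * by,
        aw * by - ax * bz + ay * bw + az * bx,
        aw * bz + ax * by - ay * bx + az * bw,
    )

def q_conj(q):
    """Conjugate; for unit quaternions this is the inverse rotation."""
    w, x, y, z = q
    return (w, -x, -y, -z)

def relative_rotation(q_chest, q_arm):
    """Arm-sensor orientation expressed in the chest sensor's frame:
    q_rel = conj(q_chest) * q_arm, the raw material for a joint angle."""
    return q_mul(q_conj(q_chest), q_arm)
```

Each wearable IMU reports its orientation in a common world frame; chaining relative rotations like this down the segment hierarchy is what turns independent sensor readings into a connected skeleton pose.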

See the full Use cases page for more deployments.

Our story

Xikaku is an American company headquartered in Venice, California, and the commercial arm of LP-Research, Inc., our Tokyo-founded sensor and sensor-fusion R&D group. Together we turn two decades of motion-sensing research into products that ship worldwide across automotive, aerospace, robotics, industrial, and AR/VR.

Read the full Our story page, from a Tokyo robotics lab to LP-Research, LPVR, and Xikaku in Venice, California.

Contact

Xikaku, Inc.
4223 Glencoe Ave, Suite C215
Marina Del Rey, CA 90292
info@xikaku.com
