Second Patent Issued: What It Unlocks for RedEye
Published: August 15, 2025
Summary: Future Optek’s second patent, US 12,380,657 B1 (Advanced networking, detection, and data visualization techniques in multiple networked devices), was granted on August 5, 2025. It protects how mixed‑reality (MR) eyewear interoperates with networked drones and sensors to capture, fuse, and visualize data—including specific multi‑drone formations, modular MR hardware elements, and methods for sharing detections and maps between users and devices.
Today we’re excited to announce the issuance of our second U.S. patent: US 12,380,657 B1. While our first patent (US 11,922,586 B1) focuses on projecting an aiming reticle via compact AR optics, this new patent covers how that eyewear interoperates with networked drones and sensors: capturing, fusing, and visualizing data across devices, and sharing detections and maps between users.
Why does this matter in the field? In traditional optics, your eye must align with the optic tube to see a useful aiming reference. RedEye turns that model inside out: the reticle lives in your natural field of view. The newly patented methods extend that model beyond a single device, so detections, maps, and sensor feeds from teammates and networked drones can appear in that same view. The result is a faster, more intuitive sight picture and improved situational awareness, even when shooting from unconventional positions or around barriers.
The patent also addresses networking and multi‑device visualization. Instructors, team members, or integrated sensors (such as drones) can be authorized to share context relevant to a mission or training scenario. While these features are roadmap‑dependent and subject to policy, our architecture—and now our IP—are designed to support them.
This grant reflects years of R&D and field testing. It also protects our customers by ensuring that the networked foundations of RedEye’s roadmap, from shared detections and maps to multi‑device MR visualization, are defensible as we bring the product to market.
We’ll post links to the Google Patents and USPTO entries as soon as they appear in public indices. In the meantime, you can find a plain‑English summary on our Technology page and subscribe for updates as we share engineering milestones.
What’s actually claimed
At a high level, Claim 1 recites a system that includes:
- a series of networked drones,
- a head‑mounted display (HMD) with a directional antenna and modular platform (see below), and
- bidirectional sharing of sensing data between drones and one or more HMDs for visualization.
The HMD can mount auxiliary sensing components on an arc rail (e.g., IR, thermal, visible cameras, EMF detectors) and includes a waveguide‑based see‑through display, IMU near the eye, and data/power ports (e.g., USB‑C) governed by an open software architecture for MR operations.
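To make that concrete, here’s a minimal Python sketch of what a drone‑to‑HMD detection message could look like. The `Detection` class, its fields, and the JSON wire format are our own illustration under assumed conventions; the claims recite bidirectional sharing of sensing data, not a specific schema.

```python
from dataclasses import dataclass, field
from typing import Literal
import json
import time

# Hypothetical detection record a drone might broadcast to authorized HMDs.
# Field names are illustrative; the patent describes bidirectional sharing
# of sensing data, not a particular wire format.
@dataclass
class Detection:
    source_id: str                             # drone or HMD that produced it
    sensor: Literal["ir", "thermal", "visible", "radar", "emf"]
    position_xyz: tuple[float, float, float]   # meters, in a shared map frame
    confidence: float                          # 0.0 to 1.0
    timestamp: float = field(default_factory=time.time)

    def to_wire(self) -> bytes:
        """Serialize for broadcast to subscribed HMDs."""
        return json.dumps(self.__dict__).encode()

# Example: a drone publishes a thermal detection; an HMD deserializes it
# and hands it to its MR overlay layer.
msg = Detection("drone-3", "thermal", (12.4, -3.1, 1.7), 0.92).to_wire()
received = json.loads(msg)
print(f"Overlay {received['sensor']} hit at {received['position_xyz']}")
```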
Additional claims specify:
- Using the antenna to help track multiple HMDs (Claim 2).
- Toggling MR visualizations between structural maps and remotely sensed detections across two or more HMDs (Claim 3).
- Drone formations: two triplets arranged to measure X, Y, Z axes for structural sensing (Claim 4), maintaining a fixed, proportionally scaling geometry with a master drone at the center forming a tetrahedron (Claim 5); see the sketch after this list.
- Edge computing at the master drone with optional backhaul to non‑local sites (Claim 6) and signature identification of pre‑selected targets (Claim 7).
- An octahedral formation (inverted + regular triplets) suitable for Delaunay/Voronoi triangulation during scanning (Claim 8).
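To make the formation geometry tangible, here is a minimal Python sketch under one possible reading of Claims 4–5: three sensing drones hold an equilateral triangle while the master drone sits above its centroid, so the four vehicles form a regular tetrahedron, and a single edge‑length parameter scales the whole formation proportionally. The function, names, and layout are our own illustration, not coordinates from the patent.

```python
import numpy as np

def tetrahedron_formation(center: np.ndarray, edge: float) -> dict[str, np.ndarray]:
    """Positions for a three-drone triplet plus a master drone.

    Three sensing drones hold an equilateral triangle; the master sits
    above the triangle's centroid so the four form a regular tetrahedron.
    Scaling `edge` grows or shrinks the formation proportionally.
    """
    r = edge / np.sqrt(3.0)            # circumradius of the base triangle
    h = edge * np.sqrt(2.0 / 3.0)      # apex height of a regular tetrahedron
    base = [center + r * np.array([np.cos(a), np.sin(a), 0.0])
            for a in (0.0, 2 * np.pi / 3, 4 * np.pi / 3)]
    master = center + np.array([0.0, 0.0, h])
    return {"drone_1": base[0], "drone_2": base[1],
            "drone_3": base[2], "master": master}

# Doubling the edge length scales every pairwise distance by the same factor.
small = tetrahedron_formation(np.zeros(3), edge=4.0)
large = tetrahedron_formation(np.zeros(3), edge=8.0)
print(np.linalg.norm(large["master"] - large["drone_1"]))  # ~8.0, the new edge
```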
Modular MR hardware (headset)
The specification details a modular mixed‑reality eyewear platform: removable arms with connectors for add‑on modules; USB‑C data/power; options for bridge‑mounted accessories; and an IMU placed close to the eye to improve head‑referenced orientation accuracy. This architecture allows field‑swappable sensors, radios, compute, and even front‑mounted protective visors or additional optics, without redesigning the core waveguide display.
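To show how an open, field‑swappable architecture might surface to software, here is a hypothetical module registry in Python. The `ModuleDescriptor` fields and `ModuleRegistry` API are our own assumptions; the specification describes swappable hardware modules, not this particular interface.

```python
from dataclasses import dataclass

# Illustrative descriptor for the arc-rail / USB-C add-on concept.
# These fields are a sketch, not the spec's API.
@dataclass(frozen=True)
class ModuleDescriptor:
    name: str            # e.g. "thermal-cam-rev2"
    kind: str            # "sensor", "radio", "compute", "optic"
    port: str            # "arc-rail-left", "bridge", "usb-c-0", ...
    power_draw_mw: int

class ModuleRegistry:
    """Tracks hot-plugged modules so MR apps can discover capabilities."""
    def __init__(self) -> None:
        self._modules: dict[str, ModuleDescriptor] = {}

    def attach(self, mod: ModuleDescriptor) -> None:
        self._modules[mod.name] = mod

    def detach(self, name: str) -> None:
        self._modules.pop(name, None)

    def by_kind(self, kind: str) -> list[ModuleDescriptor]:
        return [m for m in self._modules.values() if m.kind == kind]

registry = ModuleRegistry()
registry.attach(ModuleDescriptor("thermal-cam-rev2", "sensor", "arc-rail-left", 850))
registry.attach(ModuleDescriptor("mesh-radio", "radio", "usb-c-0", 400))
print([m.name for m in registry.by_kind("sensor")])  # ['thermal-cam-rev2']
```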
Sensors & fusion
Beyond cameras, the patent contemplates multi‑spectral and RF sensing, including 4D MIMO imaging radar for mapping and motion (Doppler) analysis, IR/thermal imaging, non‑line‑of‑sight (NLOS) techniques, and EMF detection. Data can be fused at the edge (e.g., on a master drone) and visualized in MR as overlays, 360° radar‑style maps, or relative‑position views with users at the center, supporting quick interpretation under stress.
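As one way to picture the relative‑position view, the sketch below converts world‑frame detections into bearing‑and‑range pairs with the wearer at the center, ready to draw as radar‑style blips. The function name and inputs are illustrative assumptions, not language from the patent.

```python
import math

def to_radar_view(user_xy, user_heading_rad, detections):
    """Convert world-frame 2D detections into a user-centered 360-degree view.

    Each detection becomes a (bearing_deg, range_m) pair relative to the
    wearer's position and facing, so an MR overlay can render it as a
    radar-style blip. Inputs and names are illustrative.
    """
    ux, uy = user_xy
    view = []
    for dx, dy in detections:
        rng = math.hypot(dx - ux, dy - uy)
        # Bearing relative to where the user is facing, wrapped to [0, 360).
        bearing = (math.degrees(math.atan2(dy - uy, dx - ux))
                   - math.degrees(user_heading_rad)) % 360.0
        view.append((bearing, rng))
    return view

# User at the origin facing +x; one detection dead ahead, one to the left.
print(to_radar_view((0.0, 0.0), 0.0, [(10.0, 0.0), (0.0, 5.0)]))
# -> [(0.0, 10.0), (90.0, 5.0)]
```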
Example scenarios
- Search & Rescue / Public Safety: Multi‑drone lattice scans a structure; responders’ HMDs toggle between a structural model and live detections (heat signatures, motion) to accelerate room‑by‑room clears (a toggle sketch follows this list).
- Construction / Inspection: Teams capture progress scans and anomalies via coordinated formations, sharing MR overlays on‑site and back to a remote office through the edge‑to‑cloud path.
- Defense / Training: Units share friend/foe detections and navigate with MR HUDs while drones maintain an octahedral formation for volumetric sensing; instructors can view the same MR layers.
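For the toggling behavior in the first scenario (and Claim 3), here is a minimal sketch of keeping two HMDs on the same visualization layer. The layer names and sync‑by‑forwarding approach are our own illustration, not the patent’s mechanism.

```python
from itertools import cycle

# Hypothetical view toggle: each HMD cycles between a structural-map layer
# and a live-detections layer. Layer names are our own illustration.
LAYERS = ("structural_map", "live_detections")

class HMDView:
    def __init__(self, hmd_id: str) -> None:
        self.hmd_id = hmd_id
        self._layers = cycle(LAYERS)
        self.active = next(self._layers)   # starts on the structural map

    def toggle(self) -> str:
        self.active = next(self._layers)
        return self.active

# Two HMDs kept in sync by forwarding one operator's toggle to the other.
lead, partner = HMDView("hmd-1"), HMDView("hmd-2")
lead.toggle()
partner.toggle()
print(lead.active, partner.active)  # live_detections live_detections
```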
How 12,380,657 relates to RedEye and VAS
RedEye’s near‑term product experience—AR reticle and aiming—is covered by US 11,922,586 B1. US 12,380,657 B1 covers the networked MR foundation that enables future capabilities: team overlays, drone‑assisted scanning, local sensing add‑ons, and shared maps/detections between operators. It’s a broader IP umbrella that supports where RedEye is headed while preserving our freedom to integrate additional sensors and platforms over time.
We’ll keep this post updated with technical diagrams that mirror the patent’s figures (e.g., drone tetrahedron/octahedron formations) and will add performance clips as features graduate from R&D. For the full text and drawings, see the Google Patents entry for US 12,380,657 B1.