ASUS GT-BE96_AI Concerns: Synaptics SL1680 NPU Incompatibility (Ollama & Frigate)

Rabbit-Spec

New Around Here
Hi everyone,

I recently upgraded to the ASUS GT-BE96_AI, the current flagship in the ASUS router lineup. While the networking side (Wi-Fi 7) is impressive, I’m finding the "AI Board" platform to be quite disappointing due to significant compatibility issues with its Synaptics SL1680 SoC.

Despite the marketing focus on its 7.9 TOPS NPU, the software ecosystem seems extremely closed, making it nearly impossible for enthusiasts to utilize that power. Here are the specific issues I've encountered:


1. Ollama / LLM Inference Issue: I tried deploying Ollama on the AI Board (screenshot attached). As it turns out, Ollama (and llama.cpp) cannot use the Synaptics NPU at all; inference falls back entirely to the quad-core Cortex-A73 CPU, which immediately hits 100% load. (A quick way to confirm the fallback is sketched after this list.)


The root cause seems to be the proprietary Synaptics SyNAP driver framework. Since popular inference engines don't support SyNAP natively, this NPU is essentially "dead weight" for local LLMs.

2. Frigate NPU Acceleration: Even the native Frigate container provided within the ASUS AI Board interface fails to utilize the NPU for object detection. It appears the integration between the containerized environment and the SyNAP hardware layer is either broken or incomplete.

3. The Choice of SoC (SL1680 vs. RK3588): In hindsight, if ASUS had chosen the Rockchip RK3588, the community support (via rknpu) would have allowed seamless integration with Frigate, Home Assistant, and various LLM backends. By using the SL1680, we are stuck in a "walled garden" dependent on Synaptics' niche SDK.
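
Referring back to point 1: for anyone who wants to reproduce the CPU fallback, here is a minimal Python sketch (standard library only) against Ollama's HTTP API. It runs a short prompt so a model gets loaded, then reads /api/ps, where a size_vram of 0 means the model is resident entirely in system RAM, i.e. pure CPU inference. The address (the default 127.0.0.1:11434) and the model tag "llama3.2:1b" are my assumptions; substitute whatever you have pulled onto the AI Board.

# Minimal sketch: confirm whether Ollama is serving a model on CPU only.
# Assumptions: Ollama listens on the default 127.0.0.1:11434 and the small
# example tag "llama3.2:1b" has already been pulled.
import json
import urllib.request

OLLAMA = "http://127.0.0.1:11434"

def generate(prompt, model="llama3.2:1b"):
    # Non-streaming generation, just to force the model to load into memory.
    body = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    req = urllib.request.Request(f"{OLLAMA}/api/generate", data=body,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]

def loaded_models():
    # /api/ps lists resident models; size_vram is how much sits on an accelerator.
    with urllib.request.urlopen(f"{OLLAMA}/api/ps") as resp:
        return json.load(resp)["models"]

if __name__ == "__main__":
    print(generate("Say hello in five words."))
    for m in loaded_models():
        where = "CPU only" if m["size_vram"] == 0 else "offloaded to an accelerator"
        print(f'{m["name"]}: size={m["size"]}, size_vram={m["size_vram"]} -> {where}')

If size_vram stays at 0 while the A73 cores sit at 100%, that confirms the fallback described above.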

My Questions to the Community:
Is there any word from ASUS or Synaptics regarding more open drivers or a more robust Docker integration for the NPU?

If we can't even run small models (1B-3B) with NPU acceleration, the "AI" branding on this router feels like a missed opportunity for the power-user community.

Looking forward to any insights or workarounds!
 
Is there any word from ASUS or Synaptics regarding more open drivers or a more robust Docker integration for the NPU?
Asus hasn't shared anything outside of their own FAQs. I recommend you contact them directly regarding this. For instance, Frigate is supposed to be leveraging the NPU for image recognition - if it doesn't then it should be reported as a bug.
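
If you do file a report, it helps to attach numbers rather than "it feels slow". Below is a rough Python sketch (standard library only) that pulls per-detector timings from Frigate's stats endpoint. The host and port are assumptions about how the AI Board exposes the container, and the interpretation is only a rule of thumb (CPU detectors typically report per-frame inference times an order of magnitude higher than hardware-accelerated ones), not a Synaptics-specific figure.

# Rough sketch: read Frigate's /api/stats and print per-detector inference times.
# Assumption: the Frigate container's unauthenticated API is reachable on port
# 5000; adjust the address to however the AI Board publishes it.
import json
import urllib.request

FRIGATE = "http://127.0.0.1:5000"

def detector_stats():
    with urllib.request.urlopen(f"{FRIGATE}/api/stats") as resp:
        return json.load(resp).get("detectors", {})

if __name__ == "__main__":
    for name, det in detector_stats().items():
        # inference_speed is the rolling average per-frame detection time in ms.
        print(f"detector={name}  inference_speed={det.get('inference_speed')} ms")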
 
Asus hasn't shared anything outside of their own FAQs. I recommend you contact them directly regarding this. For instance, Frigate is supposed to be leveraging the NPU for image recognition - if it doesn't then it should be reported as a bug.
Thanks for your reply!

I have already reported these issues to ASUS technical support. The main reason I’m posting here is that there is currently a significant lack of community discussion and documentation regarding the ASUS AI Board and its underlying hardware.

As you mentioned, if even the pre-installed Frigate container isn't leveraging the NPU correctly, it’s a major oversight. I’m hoping to bring more attention to these "black box" hardware issues by sharing them here.
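
One concrete way to poke at the black box, assuming you can get shell access to the AI Board, is to look at how the pre-installed Frigate container is wired up. The sketch below just wraps docker inspect: if the container is neither privileged nor has any host device nodes mapped into it, nothing inside it can reach the SyNAP hardware regardless of what the detector config says. The container name "frigate" is a guess on my part; check docker ps for the real one.

# Sketch: check whether a container has any host device nodes passed through.
# Assumption: the pre-installed Frigate container is simply named "frigate";
# run `docker ps` on the AI Board to find the actual name.
import json
import subprocess

def container_device_info(name="frigate"):
    out = subprocess.run(["docker", "inspect", name],
                         capture_output=True, text=True, check=True).stdout
    host_cfg = json.loads(out)[0]["HostConfig"]
    return bool(host_cfg.get("Privileged")), host_cfg.get("Devices") or []

if __name__ == "__main__":
    privileged, devices = container_device_info()
    print(f"privileged: {privileged}")
    if not devices:
        print("no host devices are mapped into the container")
    for dev in devices:
        print(f'{dev["PathOnHost"]} -> {dev["PathInContainer"]}')

If both come back empty, the gap is more likely in the container plumbing than in Frigate's detector code.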
 
Listen guys, don't rush into the Wi-Fi 7 (or future Wi-Fi 8) hype just yet. Being an early adopter often means you're just a high-paying beta tester for the manufacturers.

The latest flagships like the GT-BE96 are suffering from major software gaps—NPUs that don't work, driver compatibility issues, and unpredictable thermal spikes. You're paying premium prices just to deal with bugs that ruin your experience.

Stick with high-end Wi-Fi 6 / 6E models (like the GT-AX11000 Pro or RT-AX86U Pro). The firmware is rock-solid, the drivers are mature, and the performance is consistent. Don't waste your money and sanity on 'bleeding-edge' tech that isn't ready for prime time. Stability is true luxury.
 
Are you guys New Around Here twins or something? GT-BE19000AI is the only ASUS AI model I can find.
Haha, not twins! I actually just saw his thread as well—it seems we are both struggling with the exact same hardware limitations on this new platform.

To clarify the model name: I am based in China. Since the 6GHz band is not yet open for civilian use here, ASUS released the GT-BE96_AI as a regional flagship. It is essentially the GT-BE19000_AI (the model you found) but with the 6GHz radio/support removed or disabled to comply with local regulations.

The GT-BE96_AI (Mainland China version) features a 2.4GHz + 5.2GHz + 5.8GHz tri-band configuration. This is different from the GT-BE19000_AI, which uses the standard 2.4GHz + 5GHz + 6GHz setup.

Under the hood, they share the same Synaptics SL1680 SoC and the "AI Board" architecture, which is where these software compatibility issues are stemming from.
 
Wi-Fi 7 needs 2 to 3 more years to mature. Early adopters can experiment now, but for a stable experience, Wi-Fi 6 and 6E remain the best choices.

"Wi-Fi 7 is like a concept supercar: stunning on paper, but a headache to daily drive on today's roads. Stick with Wi-Fi 6/6E if you want a reliable 'daily driver'."
 
