Apple's visionOS 26.4 brings PC VR foveated streaming to Apple Vision Pro, a significant upgrade for wireless VR remote rendering.

To be clear, foveated streaming is not the same as foveated rendering, though the two techniques can be used alongside each other. Foveated streaming means encoding the area you're looking at at higher resolution in the video stream, while foveated rendering means the host device renders that area at higher resolution in the first place.

This technique is already present in Valve's Steam Frame, where it's a fundamental always-on feature of its PC VR streaming offering, delivered via the headset's bundled USB wireless adapter for the PC. Because the video decoders in headsets have a limited maximum resolution and bitrate, foveated streaming is crucial for making the most of that fixed budget.

Unlike macOS Spatial Rendering, which only supports a local Mac as a host, Apple's developer documentation describes the new Foveated Streaming as a low-level, host-agnostic framework. The documentation highlights Nvidia's CloudXR SDK as an example host, while noting that it should also work with local PCs. Apple even has a Windows OpenXR sample available on GitHub – the first and only time the company has mentioned the industry-standard XR API.

Max Thomas, the lead developer of the visionOS port of the PC VR streaming app ALVR, is currently looking into adding support for foveated streaming, though it will likely be 'a lot of work'.

Because of how the feature works, Apple's foveated streaming might even enable foveated rendering for tools like ALVR. Normally, visionOS doesn't give developers any information about where the user is looking – a restriction Apple says is there to preserve privacy. But for foveated streaming to work, the API tells the developer the 'rough' region of the frame the user is looking at. This should allow the host to render that region at higher resolution too, not just stream it at higher resolution.
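To make the decoder-budget point concrete, here's a hypothetical sketch of the kind of logic a streaming host could apply once it knows the rough gaze region. This is not Apple's API – the function names, frame sizes, and budget figures are all illustrative assumptions – but it shows why a fixed decoder pixel budget goes much further when the foveal region is prioritized:

```python
# Hypothetical sketch (not Apple's API): given a rough gaze point and a fixed
# decoder pixel budget, split the streamed frame into a native-density foveal
# inset and a downscaled periphery.

def foveal_inset(gaze_x, gaze_y, frame_w, frame_h, inset_w, inset_h):
    """Center an inset of inset_w x inset_h on the gaze point, clamped to the frame."""
    x = min(max(gaze_x - inset_w // 2, 0), frame_w - inset_w)
    y = min(max(gaze_y - inset_h // 2, 0), frame_h - inset_h)
    return (x, y, inset_w, inset_h)

def stream_densities(frame_w, frame_h, inset_w, inset_h, budget_pixels):
    """Compare uniform streaming vs. foveated streaming under one pixel budget.

    Returns (uniform, periphery): streamed pixels per source pixel when the
    budget is spread evenly, vs. what's left for the periphery after the
    inset is encoded at native (1.0) density.
    """
    uniform = budget_pixels / (frame_w * frame_h)
    periphery_pixels = budget_pixels - inset_w * inset_h
    periphery = periphery_pixels / (frame_w * frame_h - inset_w * inset_h)
    return uniform, periphery

# Illustrative numbers: a 4000x4000 eye buffer, decoder capped at ~2000x2000.
rect = foveal_inset(3900, 2000, 4000, 4000, 1200, 1200)
uniform, periphery = stream_densities(4000, 4000, 1200, 1200, 2000 * 2000)
# Uniform streaming gives every pixel 0.25x density; foveated streaming gives
# the gaze region full 1.0x density while the periphery drops to ~0.18x.
```

The host trades peripheral sharpness (which the eye can't resolve anyway) for full detail where the user is actually looking, all within the same decoder limit.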
As always, this will require the specific VR game to support foveated rendering, or to support tools that inject it.

Interestingly, Apple's documentation also states that visionOS supports displaying on-device rendered content and remote content simultaneously. The company gives the example of rendering the interior of a car or aircraft on the headset while streaming the highly detailed external world from a powerful cloud PC, which would be preferable from a perceived latency and stability perspective to rendering everything in the cloud.

We'll keep an eye on the visionOS developer community in the coming months, especially in the enterprise space, for any interesting uses of Apple's foveated streaming framework in practice.
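For a rough sense of why that hybrid split helps with perceived latency, consider a back-of-the-envelope sketch. All millisecond figures here are illustrative assumptions, not Apple's numbers:

```python
# Back-of-the-envelope sketch: motion-to-photon latency per layer in a hybrid
# local/remote setup. All millisecond figures are illustrative assumptions.

def motion_to_photon_ms(render_ms, encode_ms=0.0, network_ms=0.0, decode_ms=0.0):
    """Total delay from head movement to updated pixels for one layer."""
    return render_ms + encode_ms + network_ms + decode_ms

# Cockpit layer rendered on-device: only the local frame budget applies.
local_layer = motion_to_photon_ms(render_ms=11.1)  # ~90 Hz frame time

# Outside world streamed from a cloud PC: encode, transit, and decode add up,
# and any network jitter affects only this layer, never the cockpit.
remote_layer = motion_to_photon_ms(render_ms=11.1, encode_ms=5.0,
                                   network_ms=15.0, decode_ms=5.0)
```

The nearby geometry the user's eyes latch onto stays at local frame-time latency no matter what the network does, while only the distant streamed world carries the round-trip cost, which is far less noticeable there.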