Yea, eye-tracked foveated rendering (ETFR) would be an extremely good tool in a console environment, and I personally hope that is Sony's focus over a wireless VR implementation. The reason I think these techs are not really compatible at this point is the extremely low latency (<2 ms) and very high bandwidth needed for ETFR to reliably send the large amount of tracking data back and forth (especially if they include 4+ camera inside-out tracking, which is my guess), as well as the video/audio streams. I don't think wireless connections are ready to handle that. Maybe some sort of laser connection can do it in the future.
The headset will need some processing built in for eye tracking and inside-out tracking. That way it can reduce the data to a simple x,y,z + yaw,pitch,roll pose plus an x,y gaze position per eye. It would be a waste of bandwidth to send all that camera video to the console to be analyzed. The last-second reprojection can also be done by the headset to remove any lag in head tracking. Then the headset can double the frame rate locally.
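To put rough numbers on that data reduction, here's a sketch of what the headset would actually need to send upstream. The packet layout below is purely hypothetical (just a head pose plus per-eye gaze as floats), and the 4× 640×480 greyscale camera setup is an assumed configuration for comparison:

```python
import struct

# Hypothetical on-headset tracking packet: head pose (x, y, z, yaw, pitch, roll)
# plus a normalised (x, y) gaze point per eye -- all 32-bit floats.
packet = struct.pack(
    "<6f4f",
    0.0, 1.6, 0.0,        # head position in metres (illustrative values)
    0.0, 0.0, 0.0,        # head orientation in degrees
    0.5, 0.5, 0.5, 0.5,   # gaze (x, y) per eye
)
print(len(packet), "bytes per tracking packet")  # 40 bytes

# Versus shipping four raw 640x480 8-bit greyscale tracking-camera
# frames at 60 Hz to the console for analysis:
raw_bps = 4 * 640 * 480 * 8 * 60
print(f"{raw_bps / 1e6:.0f} Mbit/s of raw camera data")  # ~590 Mbit/s
```

Even at a 1000 Hz send rate, the processed packet is a fraction of a megabit per second, versus hundreds of megabits for the raw camera feeds.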
Wireless also has the problem of power supply. Batteries add weight, which adds discomfort, or hassle if you have to carry the battery pack on your body. Plus it adds cost: you still need to transmit at least a 1080p60 stream to the headset, likely 4K60 next gen. Compression and wireless transmission add extra lag, and compression artifacts are more visible when blown up to a 100-degree FOV. 1080p60 at 8 bits per channel is already close to 3 Gbps of uncompressed data.
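The raw bitrates are easy to work out (assuming 24 bits per pixel, i.e. 8-bit RGB, with no chroma subsampling):

```python
def uncompressed_gbps(width, height, fps, bits_per_pixel=24):
    """Raw video bitrate in Gbit/s, at 8 bits per RGB channel by default."""
    return width * height * fps * bits_per_pixel / 1e9

print(f"1080p60: {uncompressed_gbps(1920, 1080, 60):.2f} Gbit/s")  # ~2.99
print(f"4K60:    {uncompressed_gbps(3840, 2160, 60):.2f} Gbit/s")  # ~11.94
```

A next-gen 4K60 stream is four times the pixel count, so roughly 12 Gbit/s uncompressed, which is why heavy compression (and its latency and artifacts) is unavoidable over wireless.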
You can always offer both, or a wireless add-on: something that replaces the cable, with a small battery pack carried on your body.