Percent calculation, my friend.
23fps is roughly 23% less than 30fps: the drop is 7fps, and 7/30*100% ≈ 23.3%. So 23fps is a ~23% decrease from 30fps.
At the same time, 30fps is 30/23*100% ≈ 130.43% of 23fps.
130.43% - 100% = 30.43% increase.
Or 7/23*100% ≈ 30.43%, same result. The percentage always depends on which value you take as the base.
It's like the Xbox One having ~29% less shader power than the PS4, while the PS4 has ~41% more than the Xbox One: same gap, different base.
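A quick check of that asymmetry (plain Python, just the arithmetic from above):

```python
def pct_change(base: float, new: float) -> float:
    """Percent change from base to new; the base decides the result."""
    return (new - base) / base * 100

print(pct_change(30, 23))   # -23.33 -> 23fps is ~23% less than 30fps
print(pct_change(23, 30))   #  30.43 -> 30fps is ~30% more than 23fps
```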
No offense, but I've seen a load of people with problems in percent calculation.
Now for the other stuff.
Pretty much every major title uses some kind of culling today. That was an issue in pre-DX8 days, when routines for such things usually weren't supported in hardware and had to be written by hand.
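To make that concrete, here is a minimal Python sketch of per-object view-frustum culling (Plane/Sphere and the field names are made up for illustration, not any engine's real API): anything whose bounding sphere lies fully outside the camera frustum never gets a draw call.

```python
from dataclasses import dataclass

@dataclass
class Plane:
    nx: float; ny: float; nz: float; d: float   # plane equation: n.p + d = 0

@dataclass
class Sphere:
    x: float; y: float; z: float; r: float      # object's bounding sphere

def is_visible(s: Sphere, frustum: list) -> bool:
    """Cull the object if its bounding sphere lies fully behind any frustum plane."""
    for p in frustum:
        dist = p.nx * s.x + p.ny * s.y + p.nz * s.z + p.d
        if dist < -s.r:          # completely outside this plane -> culled
            return False
    return True

# A single plane facing +x through the origin; a sphere far behind it gets culled.
frustum = [Plane(1.0, 0.0, 0.0, 0.0)]
print(is_visible(Sphere(5.0, 0.0, 0.0, 1.0), frustum))    # True  (in front)
print(is_visible(Sphere(-5.0, 0.0, 0.0, 1.0), frustum))   # False (culled)
```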
The ST Kyro (PowerVR based) was well known back then for very efficient hidden surface removal because of its then-new tile-based renderer. Without hardware T&L it could be faster in many games than a GeForce with higher fillrate and hardware T&L.
Draw call batching is commonly used by game engines, often by default (in Unity it's optional, though). There is no perfect batching. It seems batching in Anvil was changed multiple times, which is one reason the later entries had fewer issues, whereas AC1, and to a lesser degree AC2, had framerate drops.
CryEngine uses batching by default. AFAIR, all the planes in the Unity scenario would automatically be rendered in one batch because they use the same material.
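Roughly what batching does under the hood, as a Python sketch (RenderObject and the material name are made up, not any engine's real code): objects sharing a material get merged and submitted as one draw call instead of one per object.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class RenderObject:
    material: str       # shared material = batchable
    vertices: list      # this object's vertex data

def build_batches(objects):
    """Group objects by material; each group becomes a single draw call."""
    batches = defaultdict(list)
    for obj in objects:
        batches[obj.material].extend(obj.vertices)
    return batches

# 50 planes with one shared material -> 1 batch, 1 draw call instead of 50
scene = [RenderObject("plane_metal", [(0.0, 0.0, 0.0)]) for _ in range(50)]
for material, merged in build_batches(scene).items():
    print(f"draw_call(material={material!r}, vertex_count={len(merged)})")
```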
Now for anti-aliasing:
pCARS (PC) offers selectable FXAA, SMAA, MSAA and downsampling. The console versions use EQAA, but with the Wii U's slightly older GPU and less power, I'm pretty sure they'd have gone with just FXAA or SMAA there.
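Of those, downsampling is the easiest to show in code. A sketch with numpy, assuming a plain 2x2 box filter (real implementations vary): render at twice the target resolution, then average each 2x2 block into one output pixel, which smooths jagged edges.

```python
import numpy as np

def downsample_2x(image: np.ndarray) -> np.ndarray:
    """(2H, 2W, 3) high-res render -> (H, W, 3) output; edges get averaged."""
    h, w = image.shape[0] // 2, image.shape[1] // 2
    return image.reshape(h, 2, w, 2, 3).mean(axis=(1, 3))

hi_res = np.random.rand(1440, 2560, 3)   # stand-in for a 1440p render
print(downsample_2x(hi_res).shape)       # (720, 1280, 3) -> smoothed 720p image
```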
You're guessing that they don't use draw call batching at all, based purely on how Unity handles it. And that's without reading the documentation of other game engines, I'd guess.
Same for culling.
I'd recommend at least reading the CryEngine and Unreal Engine docs.