What about dynamic resolution? Can you tell when a game drops resolution during explosions and stuff?
I can normally tell when a game shifts resolution during gameplay, but it becomes harder to notice because it usually happens in intense scenes, when you're distracted by the game itself. So to answer your question: in practice, it's difficult to tell the difference.
In my experience, there is not an eye-popping difference between 1440p and 4K in games. A difference, yes, but not the overblown one people describe. This comes from someone who games at 1440p/144Hz on PC and at 4K (dynamic vs. native ~ this matters) 30-60fps on the Xbox One X. I generally prefer the former because the gap is smaller than you'd expect, and PC games have better graphics anyway. Once higher-res monitors with high refresh rates become common (I'm waiting on 8K/120-144Hz for PC, since that's when the extra rendering will truly make a difference), that is when things will stomp 1440p.
That said, watching movies in 4K vs 1440p is where the difference really shows, and HDR is a godsend, but for gaming the gap between those two resolutions just isn't that big.
That also depends on the size of your screen. I'm guessing you game on a monitor?
On a 55-inch TV, 4K is quite noticeable. I've fiddled around with resolution settings in games to compare, and I can easily see the improvement on my screen.
Most PC gamers use smaller monitors, around 22-inch to 32-inch. That size is perfect for 1080p and 1440p, but on a 55-inch to 75-inch TV you can easily see the difference between 4K and 1440p. I'm in the minority here, though, because the majority of PC gamers don't game on a TV; they normally prioritise refresh rate on a monitor instead.
I will be buying a decent monitor next year as I'm moving house, and I'm looking at something good. If I stick to a 32-inch size, I probably wouldn't need 4K and could drop to 1440p and put the headroom into higher framerates instead.