Let's say we have a function f(x); for concreteness, something like f(x) = x^3 (the specific function doesn't really matter).
Now let's say g(x) is a transformation of f(x), specifically a horizontal scaling: g(x) = f(Ax). Consider what happens for different values of A:
If A = 1, the two graphs are identical.
If A = 2, the graph is compressed horizontally.
If A = 5, the graph is compressed horizontally even further.
If A = 0.5, the graph is stretched horizontally.
If A = 0.1, the graph is stretched horizontally even further.
As you can see, as A increases, the graph is compressed horizontally, and as A decreases, the graph is stretched horizontally. Following that logic, the further A decreases, the more the graph should stretch.
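To make what I mean concrete, here's a quick Python sketch. The cubic is just the example from above, and the reference point f(2) = 8 is an arbitrary choice of mine; any point on the graph would do:

```python
# A quick numerical check of g(x) = f(A*x), using f(x) = x**3 from
# the example above (nothing here depends on the cube specifically).
# The point f(2) = 8 on f's graph shows up on g's graph at x = 2/A,
# so larger A pulls the feature toward the y-axis (compression) and
# smaller positive A pushes it away (stretching).

def f(x):
    return x ** 3

def g(x, A):
    return f(A * x)

x0 = 2.0  # reference feature: f(2) = 8
for A in [1, 2, 5, 0.5, 0.1]:
    x_feature = x0 / A  # where g takes the value f(x0)
    print(f"A = {A:>4}: g({x_feature:g}) = {g(x_feature, A):g}")
```

Running it, the value 8 shows up at x = 2, 1, 0.4, 4, and 20 respectively, matching the compress/stretch pattern above.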
But that's not the case: once A dips below zero, the trend reverses. As A goes from 1 to 0.1 to 0.01 to 0.001, etc., the graph keeps stretching. But as A goes from -0.01 to -0.1 to -1, etc., the graph begins to compress again (as a mirror image, reflected across the y-axis), completely reversing the direction the trend was heading. This doesn't make sense to me. Shouldn't the effect of increasing or decreasing A be continuous, even when passing through zero?
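Here's the same feature-tracking check pushed through zero, which shows the jump I'm puzzled by:

```python
# The same feature-tracking idea, but pushing A through zero. The
# point f(2) = 8 appears on the graph of g(x) = f(A*x) at x = 2/A;
# watching that location as A crosses zero shows the reversal.

for A in [0.1, 0.01, 0.001, -0.001, -0.01, -0.1, -1]:
    x_feature = 2 / A  # runs off toward +infinity, then reappears from -infinity
    print(f"A = {A:>6}: feature now sits at x = {x_feature:g}")
```

The feature's location 2/A runs off to +2000 and beyond as A shrinks toward zero from above, then reappears at -2000 coming back from the other side, which is exactly the reversal I described.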
I understand the rules and know how to use them; I just started thinking about the effect on the graph as you change A, and it seemed strange. With every other transformation (vertical stretch, vertical shift, horizontal shift), the effect seems to vary continuously with the parameter's value, but not with horizontal stretches. Does anyone else find this strange?