1) What conditions must be satisfied for the composite function f(g(x)) to exist?
2) Does the range of g need to be a subset of the domain of f, or is it enough that the range of g and the domain of f merely intersect? And what is the domain of f(g(x))?
3) For what real values of k can f(g(x)) be defined?
The conditions for f(g(x)) to exist are:
* the range of g(x) must be a subset of the domain of f(x),
or
* f(g(x)) must be defined with the domain of g(x) restricted so that its range is a subset of the domain of f(x) (see the sketch just below).
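To illustrate the subset check, here is a minimal Python sketch with a made-up finite example (the sets and the lookup table are my own, not from the question):

```python
# Hypothetical finite example: check range(g) ⊆ domain(f) before composing.
domain_f = {0, 1, 2, 3, 4}
g = {10: 1, 11: 2, 12: 3}            # g given as a lookup table on its domain
range_g = set(g.values())            # {1, 2, 3}

if range_g.issubset(domain_f):
    f = lambda x: x ** 2
    fog = {x: f(g[x]) for x in g}    # composite defined on the whole domain of g
    print(fog)                       # {10: 1, 11: 4, 12: 9}
```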
2. This has already been answered in the response to #1.
3. This question makes no sense. (What is k?)
~MetaHub Panel for the Teachers’ Forum
Thank you very much for helping to clear my doubts. In the 1st case I have not mentioned functions, so don't we have to treat them as just polynomial expressions? In the 3rd case I cannot see a k there; could you please explain a little bit more, Madam?
I am really sorry for not pointing it out: that should be x instead of k, Madam. I hope you will excuse me; I still couldn't find how to use options like editing and deleting on my Android.
1) When f(x) and g(x) are given as expressions, you can simply replace x in f by g(x), which gives an expression for f(g(x)).
Let f(x) = x² and g(x) = √(x − 1). Then f(g(x)) = x − 1 and, treated purely as an expression, f(g(0)) = −1.
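As a small Python sketch of this expression-only view (using sympy; my own illustration of the example above):

```python
# Treating f and g as bare expressions: substitution gives f(g(x)) = x - 1,
# and evaluating that expression at 0 gives -1 even though sqrt(0 - 1) is not real.
import sympy as sp

x = sp.Symbol('x')
f = x**2
g = sp.sqrt(x - 1)

fog = sp.simplify(f.subs(x, g))   # (sqrt(x - 1))**2 simplifies to x - 1
print(fog)                        # x - 1
print(fog.subs(x, 0))             # -1, as a pure expression evaluation
```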
2) When the domains of f and g are given, you can find the range of g, and f(g(x)) exists if the range of g is a subset of the domain of f. In this case the domain of f(g(x)) is the same as the domain of g.
In the previous example, if the domain of f is given as all real numbers and the domain of g as the real numbers not less than 1, the range of g is the set of non-negative real numbers [0, ∞), which is a subset of the domain of f. Therefore f(g(x)) can be defined. But f(g(0)) cannot be defined, because 0 is not in the domain of the composite function. Since the range of f is not a subset of the domain of g, g(f(x)) cannot be defined.
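Here is a minimal Python sketch of the same pair with the domains attached (the domain check is a hypothetical helper of mine): the composite inherits the domain of g, so f(g(0)) is rejected because 0 < 1, not because of anything in f.

```python
import math

def g(x):
    if x < 1:                       # domain of g: [1, infinity)
        raise ValueError(f"{x} is not in the domain of g")
    return math.sqrt(x - 1)

def f(x):                           # domain of f: all real numbers
    return x ** 2

def fog(x):                         # domain of f(g(x)) = domain of g
    return f(g(x))

print(fog(5))                       # 4.0
print(fog(0))                       # raises ValueError: 0 is not in the domain of g
```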
3) If you are asked to find f(g(x)) together with its domain, without the domains of f and g being given, you need to restrict the domain of g so that the range of g becomes a subset of the domain of f.
Let f(x) = √(x − 1) and g(x) = x². Here you need to restrict the domain of g to (−∞, −1] ∪ [1, ∞) so that the range of g, which is then [1, ∞), is a subset of the domain of f.
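A minimal Python sketch of this restricted composite (again with a hypothetical domain guard of my own): g(x) = x² is only accepted on (−∞, −1] ∪ [1, ∞), so that g(x) ≥ 1 lands inside the domain [1, ∞) of f(x) = √(x − 1).

```python
import math

def fog(x):
    if abs(x) < 1:                  # outside the restricted domain of g
        raise ValueError(f"{x} is not in the restricted domain of g")
    return math.sqrt(x ** 2 - 1)    # f(g(x)) = sqrt(x**2 - 1)

print(fog(-2))                      # sqrt(3) ≈ 1.732
print(fog(0.5))                     # raises ValueError
```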
I hope this answer is sufficient to give a clear picture of the issue. I think the best way to understand it fully is to practise a few more problems. I would appreciate your suggestions regarding this answer. Because of a system error, your reply may sometimes be taken as another answer.