To answer this question, consider two different ways of decomposing joint probabilities:

$$P(A, B, C) = P(B, C | A)\, P(A) \tag{1}$$

$$P(A, B, C) = P(A | B, C)\, P(B, C) \tag{2}$$
Both equations can be simplified further, given two types of independence assumptions. If $B$ and $C$ are conditionally independent given $A$, i.e. $P(B, C | A) = P(B | A)\, P(C | A)$, then eq. (1) becomes

$$P(A, B, C) = P(B | A)\, P(C | A)\, P(A).$$

If $B$ and $C$ are (marginally) independent, i.e. $P(B, C) = P(B)\, P(C)$, then eq. (2) becomes

$$P(A, B, C) = P(A | B, C)\, P(B)\, P(C).$$
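As a sanity check, both decompositions can be verified numerically. The sketch below uses a small, made-up joint distribution over three binary events (the table values and the helper name `p` are ad-hoc choices for this illustration, not from the text):

```python
from itertools import product

# A small, made-up joint distribution P(A, B, C) over three binary events,
# keyed by the outcome triple (a, b, c); the values sum to 1.
joint = {
    (0, 0, 0): 0.10, (0, 0, 1): 0.15, (0, 1, 0): 0.05, (0, 1, 1): 0.20,
    (1, 0, 0): 0.12, (1, 0, 1): 0.08, (1, 1, 0): 0.18, (1, 1, 1): 0.12,
}

def p(**fixed):
    """Marginal probability of the named outcomes, e.g. p(a=0, b=1)."""
    return sum(prob for (a, b, c), prob in joint.items()
               if all({'a': a, 'b': b, 'c': c}[name] == value
                      for name, value in fixed.items()))

for a, b, c in product([0, 1], repeat=3):
    # eq. (1): P(A,B,C) = P(B,C | A) P(A), with P(B,C | A) = P(A,B,C) / P(A)
    eq1 = (p(a=a, b=b, c=c) / p(a=a)) * p(a=a)
    # eq. (2): P(A,B,C) = P(A | B,C) P(B,C), with P(A | B,C) = P(A,B,C) / P(B,C)
    eq2 = (p(a=a, b=b, c=c) / p(b=b, c=c)) * p(b=b, c=c)
    assert abs(joint[(a, b, c)] - eq1) < 1e-12
    assert abs(joint[(a, b, c)] - eq2) < 1e-12
```

Both identities hold for every outcome, as they must: they are just the chain rule applied in two different orders.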
The simplified eq. (1) now only involves terms with two events at most. Can we also achieve this for eq. (2) by further decomposing the first term $P(A | B, C)$? The answer is no. Let us look at an example.
Take, for instance, $B$ and $C$ to be independent fair coin flips and $A = B \oplus C$ (XOR). Here, it holds that $P(B, C) = P(B)\, P(C)$. But even though $B$ and $C$ are independent, $A$ depends on the combination of both. Can the probability $P(A | B, C)$ be simplified further by allowing an arbitrary function $f$ of simpler terms,

$$P(A | B, C) = f(\text{simpler terms})? \tag{3}$$
Here, "simpler terms" means quantities such as $P(B)$, $P(C)$, and so on.
Let's show that this is not possible.
Assume $B=0, C=0$ and consider the two cases $A=0$ and $A=1$. Evaluating eq. (3) above for both gives

$$P(A=0 | B=0, C=0) = f(\text{simpler terms}) \qquad \text{and} \qquad P(A=1 | B=0, C=0) = f(\text{simpler terms}),$$

where, in the example, the two conditional probabilities on the left differ.
This is impossible if the simpler terms are the same in both equations, which is the case for terms like $P(B)$ and $P(C)$: they do not depend on the value of $A$, so $f$ returns the same value in both cases. The equations can only agree if one adds parts of the configuration $(A, B, C)$ as arguments to $f$. But then the complexity of the function increases so much that one would no longer call it a decomposition or simplification.
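The contradiction can be made concrete in a few lines. The sketch below assumes a specific instance of the setup (not spelled out above): $B$ and $C$ are independent fair coin flips and $A = B \oplus C$, so $P(B, C) = P(B)\, P(C)$ while $A$ depends on the combination of both:

```python
from itertools import product

# Assumed concrete instance: B and C are independent fair coin flips
# and A = B XOR C, so A depends on the *combination* of B and C.
def p_a_given_bc(a, b, c):
    return 1.0 if a == (b ^ c) else 0.0

p_b = {0: 0.5, 1: 0.5}
p_c = {0: 0.5, 1: 0.5}
# Marginal of A: P(A=a) = sum over (b, c) of P(a | b, c) P(b) P(c).
p_a = {a: sum(p_a_given_bc(a, b, c) * p_b[b] * p_c[c]
              for b, c in product([0, 1], repeat=2))
       for a in [0, 1]}

# Any f of the "simpler terms" P(A=a), P(B=b), P(C=c) receives identical
# arguments for (A=0, B=0, C=0) and (A=1, B=0, C=0), since P(A=0) = P(A=1).
args_a0 = (p_a[0], p_b[0], p_c[0])   # what f would see for the case A=0
args_a1 = (p_a[1], p_b[0], p_c[0])   # what f would see for the case A=1
assert args_a0 == args_a1            # so f must return the same value...
assert p_a_given_bc(0, 0, 0) != p_a_given_bc(1, 0, 0)  # ...but targets differ
```

No single-valued $f$ can map equal arguments to the two different targets $1$ and $0$, which is exactly the impossibility argued above.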
Instead of demanding exact equality, you can view eq. (3) as an ansatz in which you are free to choose any function $f$. In that case, you should carefully check the approximation error you introduce. To keep this transparent, start with a simple ansatz, e.g. one that is linear either directly or in log space.
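As a minimal illustration of such an error check, the sketch below evaluates the crudest possible ansatz, $f = P(A)$, against an assumed worst-case target: the XOR construction with independent fair coins $B$, $C$ and $A = B \oplus C$ (a hypothetical example, chosen because $A$ depends on the combination of $B$ and $C$):

```python
from itertools import product

# Error check for the ansatz in eq. (3) with the simplest possible f:
# approximate P(A | B, C) by the marginal P(A) alone.
# Assumed target model: A = B XOR C with independent fair coins B and C.
def p_true(a, b, c):
    """Exact conditional P(A=a | B=b, C=c) under the XOR model."""
    return 1.0 if a == (b ^ c) else 0.0

p_a = {0: 0.5, 1: 0.5}   # marginal of A under this model

errors = [abs(p_true(a, b, c) - p_a[a])
          for a, b, c in product([0, 1], repeat=3)]
print(max(errors))       # worst-case error of the ansatz
# → 0.5
```

The worst-case error is $0.5$, the largest possible for a probability centered at $1/2$: exactly the kind of failure such a check is meant to surface before the ansatz is trusted.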
Let us now also give an example for eq. (1), assuming conditional independence: for instance, let $A$ be the (unknown) bias of a coin, and let $B$ and $C$ be two flips of that coin, so that $B$ and $C$ are independent once $A$ is known.
Then $P(B, C | A) = P(B | A)\, P(C | A)$, but in general $P(B, C) \neq P(B)\, P(C)$, and the simplified eq. (1) holds as given above.
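A quick numerical sketch of this situation, assuming a concrete two-coin mixture model (a hypothetical choice for illustration: $A$ selects one of two biases, and $B$, $C$ are two flips with the selected bias):

```python
# Assumed example of conditional independence: A picks one of two coins
# (bias 0.9 or 0.1, each with probability 0.5); B and C are two flips of it.
bias = {0: 0.9, 1: 0.1}
p_a = {0: 0.5, 1: 0.5}

def p_flip(x, a):
    """P(B=x | A=a) = P(C=x | A=a): one flip of the selected coin."""
    return bias[a] if x == 1 else 1 - bias[a]

def p_bc_given_a(b, c, a):
    """The flips are independent *given* the coin: eq. (1) simplifies."""
    return p_flip(b, a) * p_flip(c, a)

def p_bc(b, c):
    """Marginal P(B, C), mixing over the unknown coin A."""
    return sum(p_bc_given_a(b, c, a) * p_a[a] for a in [0, 1])

def p_b(b):
    return sum(p_bc(b, c) for c in [0, 1])

def p_c(c):
    return sum(p_bc(b, c) for b in [0, 1])

# Conditionally independent by construction, yet marginally correlated:
print(p_bc(1, 1), p_b(1) * p_c(1))
```

The two printed numbers differ (roughly $0.41$ versus $0.25$): seeing one flip come up heads makes the biased coin more likely and thereby predicts the other flip, so $P(B, C) \neq P(B)\, P(C)$ even though $P(B, C | A) = P(B | A)\, P(C | A)$.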