If I'm not mistaken, you at least have the following bound:
$$
s(B) \;\leq\; d^n \,-\, \exp\left( -2 \left( \frac{\ln B}{\ln(n!) \,-\, d\ln((n/d)!)} \right)^2 \right) d^{n}.
$$
You may want to fiddle a little with Stirling's approximation to simplify the denominator of the fraction further, but I didn't want to weaken the bound more than necessary.
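For instance, here is a rough sketch of what that fiddling could look like, keeping only the leading terms of Stirling's formula $\ln m! \approx m\ln m - m + \frac{1}{2}\ln(2\pi m)$ and assuming $d$ divides $n$:
$$
\ln(n!) \,-\, d\ln((n/d)!) \;\approx\; n\ln d \,-\, \frac{d-1}{2}\ln(2\pi n) \,+\, \frac{d}{2}\ln d,
$$
so the denominator behaves like $n\ln d$ for large $n$.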
I found this by upper-bounding the probability that a multinomial distribution with uniform bin probabilities yields an "atypical" observation, using Hoeffding's inequality. "Atypical" is here defined in the information-theoretic sense: An atypical observation is one whose logarithmic probability falls short of the expected logarithmic probability over the whole distribution (see Cover and Thomas' Elements of Information Theory, particularly chapter 11).
Some more details:
Let $p$ denote the point probabilities of a multinomial distribution with $n$ trials and $d$ equiprobable bins:
$$
p(c) \;=\; \left( \frac{n!}{c_1!\,c_2!\cdots c_d!} \right) d^{-n}.
$$
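As a concrete example, take the (arbitrary) values $n = 4$, $d = 2$, and $c = (3, 1)$:
$$
p(3, 1) \;=\; \frac{4!}{3!\,1!}\, 2^{-4} \;=\; \frac{4}{16} \;=\; \frac{1}{4}.
$$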
Notice that
$$
\ln p(c) \,+\, n\ln d \;=\; \ln \frac{n!}{c_1!\,c_2!\cdots c_d!},
$$
and thus
$$
\ln p(c) \,+\, n\ln d \;\leq\; \ln B \quad\iff\quad \frac{n!}{c_1!\,c_2!\cdots c_d!} \;\leq\; B.
$$
Further, the entropy of the flat multinomial distribution is at most the entropy of $n$ samples from the flat categorical distribution with $d$ bins: the sample sequence carries order information as well as frequency information, while the count vector retains only the frequency information.
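In symbols, if we write $c = g(x_1, x_2, \ldots, x_n)$ for the count vector of the individual draws $x_1, \ldots, x_n$ (notation I'm introducing just for this step), then, since applying a deterministic function can never increase entropy,
$$
H(c) \;=\; H(g(x_1, \ldots, x_n)) \;\leq\; H(x_1, \ldots, x_n) \;=\; n\ln d.
$$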
We thus have the following bound on the expected value of $-\ln p(c)$:
$$
E\left[\, -\ln p(c) \,\right] \;\leq\;
n \cdot H\left( \frac{1}{d}, \frac{1}{d}, \ldots, \frac{1}{d} \right)
\;=\; n\ln d,
$$
or in other words, $-n\ln d \leq E[\ln p(c)]$.
Further, the minimum and maximum values of the random variable $\ln p(c)$ are (assuming for simplicity that $d$ divides $n$)
$$
a
\;=\;
\min_c \ln p(c)
\;=\;
\ln\left( \frac{n!}{n!\, 0!\, 0! \cdots 0!}\, d^{-n} \right)
\;=\;
-n\ln d;
$$
$$
b
\;=\;
\max_c \ln p(c)
\;=\;
\ln\left( \frac{n!}{(n/d)!\, (n/d)! \cdots (n/d)!}\, d^{-n} \right)
\;=\;
\ln(n!) \,-\, d\ln((n/d)!) \,-\, n\ln d.
$$
The squared distance between these two extremes is consequently
$$
(a - b)^2 \;=\; \left( \ln(n!) \,-\, d\ln((n/d)!) \right)^2.
$$
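As a quick numeric check with the same arbitrary values $n = 4$, $d = 2$: $a = -4\ln 2 \approx -2.77$ and $b = \ln 6 - 4\ln 2 \approx -0.98$, so $a - b = -\ln 6$ and $(a - b)^2 = (\ln 6)^2 = \left( \ln(4!) - 2\ln(2!) \right)^2 \approx 3.21$, in agreement with the formula.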
We can now make the following comparisons:
\begin{eqnarray}
\Pr\left\{ \frac{n!}{c_1!\,c_2!\cdots c_d!} < B \right\}
& = &
\Pr\left\{ \ln p(c) \,+\, n\ln d \,<\, \ln B \right\}
\\
& \leq &
\Pr\left\{ \ln p(c) \,-\, E[\ln p(c)] \,<\, \ln B \right\}
\\
& = &
1 \,-\, \Pr\left\{ \ln p(c) \,-\, E[\ln p(c)] \,\geq\, \ln B \right\}
\\
& \leq &
1 \,-\, \exp\left( -2 \left( \frac{\ln B}{\ln(n!) \,-\, d\ln((n/d)!)} \right)^2 \right).
\end{eqnarray}
The first inequality follows from the fact that a proposition never has a higher probability than its logical consequences (here I used $-E[\ln p(c)] \leq n\ln d$ from above); the second is an application of Hoeffding's inequality.
We thus have
$$
\Pr\left\{ \frac{n!}{c_1!\,c_2!\cdots c_d!} < B \right\}
\;\leq\;
1 \;-\; \exp\left( -2 \left( \frac{\ln B}{\ln(n!) \,-\, d\ln((n/d)!)} \right)^2 \right).
$$
By multiplying by $d^n$, we obtain the inequality stated above, since the probability of a sample from a flat multinomial distribution is equal to the corresponding multinomial coefficient divided by $d^n$.
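For what it's worth, here is a small brute-force sanity check of the final bound (a sketch in Python; I'm assuming here that $s(B)$ denotes the number of length-$n$ sequences over $d$ symbols whose multinomial coefficient falls below $B$, and the function names are my own):

```python
import math
from itertools import product

def multinomial_coeff(counts):
    """n! / (c_1! c_2! ... c_d!) for a count vector."""
    out = math.factorial(sum(counts))
    for c in counts:
        out //= math.factorial(c)
    return out

def s_exact(B, n, d):
    """Brute force: number of sequences in {0,...,d-1}^n whose
    count vector has multinomial coefficient below B."""
    total = 0
    for seq in product(range(d), repeat=n):
        counts = [seq.count(k) for k in range(d)]
        if multinomial_coeff(counts) < B:
            total += 1
    return total

def s_bound(B, n, d):
    """The claimed bound d^n - exp(...) * d^n (assumes d divides n)."""
    denom = math.log(math.factorial(n)) - d * math.log(math.factorial(n // d))
    return d**n * (1.0 - math.exp(-2.0 * (math.log(B) / denom) ** 2))

for n, d, B in [(4, 2, 3), (6, 2, 10), (6, 3, 50)]:
    print(n, d, B, s_exact(B, n, d), round(s_bound(B, n, d), 2))
```

On these small cases the exact count does stay below the bound (for example, $n = 6$, $d = 3$, $B = 50$ gives an exact count of $279$ against a bound of about $568$).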