1 Introduction and notations

The graphs considered in this paper are finite and simple. We follow Bondy and Murty (1976) and Molloy and Reed (2002, 1998) for terminology and notation not defined here. Let \(G=(V,E)\) be a graph. A vertex coloring of a graph \(G\) is called acyclic if it is a proper coloring in which every cycle \(C\) receives at least three colors. The acyclic chromatic number of \(G\), denoted by \(A(G)\), is the least number of colors in an acyclic vertex coloring of \(G\). The concept of acyclic colorings of graphs was introduced by Grünbaum (1973), and further results have been obtained in Alon et al. (1991), Gerke and Raemy (2007), Greenhill and Pikhurko (2005). In Alon et al. (1991), Alon, McDiarmid and Reed gave upper and lower bounds for the acyclic chromatic number. They proved that, for some constant \(c>0\),

$$\begin{aligned} \frac{c\Delta ^{\frac{4}{3}}}{(\log \Delta )^{\frac{1}{3}}}\le A(G) \le 50 \Delta ^{\frac{4}{3}}. \end{aligned}$$

In Goncalves et al. (2014), the authors proved the following result by using the entropy compression method.

Theorem 1

Let \(G\) be a graph with maximum degree \(\Delta \). Then

$$\begin{aligned} \frac{c\Delta ^{\frac{4}{3}}}{(\log \Delta )^{\frac{1}{3}}}\le A(G) \le \frac{3}{2} \Delta ^{\frac{4}{3}}. \end{aligned}$$

In this paper, we give a class of graphs whose acyclic chromatic number is linear in the maximum degree. More precisely, we obtain the following result.

Theorem 2

Let \(G\) be a graph with maximum degree \(\Delta \ge 4\) and girth \(g\ge 4\Delta \). Then \(A(G)\le 12\Delta \).

2 Proof of Theorem 2

We make use of the Lovász Local Lemma as our main tool in the proof of Theorem 2. Before giving the proof of Theorem 2, we state the general version of the Lovász Local Lemma (see Molloy and Reed (2002) for details) as follows.

Lemma 2.1

Molloy and Reed (2002) Let \(A_1, A_2,\cdots , A_n\) be events in an arbitrary probability space. Let the graph \(H=(V,E)\) on the nodes \(\{1,2,\cdots ,n\}\) be a dependency graph for the events \(A_i\); that is, assume that for each \(i\), \(A_i\) is mutually independent of the family of events \(\{A_j: (i,j)\notin E\}\). If there are reals \(0\le x_i<1\) such that for all \(i\),

$$\begin{aligned} Pr(A_i)\le x_i \prod \limits _{(i,j)\in E}(1-x_j), \end{aligned}$$

then

$$\begin{aligned} Pr\left( \bigcap _i\bar{A_i}\right) > 0. \end{aligned}$$

Proof of Theorem 2

Let \(G=(V,E)\) be a graph with maximum degree \(\Delta \ge 4\) and girth \(g\ge 4\Delta \). Our aim is to prove that there exists a coloring of the vertices of \(G\), \(f: V\rightarrow \{1,2,\cdots ,x\}\), where \(x\) will be specified later, such that \(f\) satisfies the following two properties.

(i) every two adjacent vertices receive different colors;

(ii) there are at least \(3\) colors in every cycle \(C\) of \(G\).


Such a coloring \(f\) is then an acyclic coloring of \(G\).

We first color the vertices of \(G\) at random. Put \(x=12\Delta \) and let \(f:V\rightarrow \{1,2,\cdots ,x\}\) be a random vertex coloring of \(G\), where for each vertex \(v\in V\) the color \(f(v)\in \{1,2,\cdots ,x\}\) is chosen randomly and independently according to a uniform distribution on \(\{1,2,\cdots ,x\}\).
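For concreteness, the random coloring step can be phrased procedurally. The following is only a minimal sketch (not part of the proof), assuming the vertices are given as an iterable `vertices`; it assigns each vertex an independent uniform color from \(\{1,2,\cdots ,x\}\) with \(x=12\Delta \).

```python
import random

def random_coloring(vertices, Delta):
    """Color each vertex independently and uniformly at random from {1, ..., 12*Delta}."""
    x = 12 * Delta
    return {v: random.randint(1, x) for v in vertices}
```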

In order to apply Lemma 2.1 (the Local Lemma), we consider the following two types of bad events.

Type I. For each pair of adjacent vertices \(u\) and \(v\) of \(G\), let \(A_{u,v}\) be the event that

$$\begin{aligned} f(u)=f(v). \end{aligned}$$

Type II. For each even cycle \(C\), we say that the event \(A_C\) occurs if \(C\) receives exactly two colors under the random coloring \(f\). If the length of the cycle \(C\) is \(i\), then such an event is called an event of type-\(i\).
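For illustration, the two kinds of bad events can be tested directly on a coloring `f` produced as above; here `u, v` denotes an edge and `cycle` the vertex sequence of an even cycle (these names are assumptions of this sketch, not notation from the paper).

```python
def type_I_occurs(f, u, v):
    """Type I: the adjacent vertices u and v receive the same color."""
    return f[u] == f[v]

def type_II_occurs(f, cycle):
    """Type II: the even cycle receives exactly two distinct colors."""
    return len({f[v] for v in cycle}) == 2
```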

Then we obtain the following proposition.

Proposition 2.2

If no event of Type I or Type II occurs, then \(f\) is an acyclic coloring satisfying properties (i) and (ii).

Proposition 2.2 is straightforward: if no Type I event occurs, then \(f\) is a proper coloring, so every odd cycle receives at least three colors; and if in addition no Type II event occurs, then every even cycle also receives at least three colors. In the following we show that with positive probability none of these two kinds of bad events happens, and then Theorem 2 follows by applying Lemma 2.1 (the Local Lemma).

Let us construct the dependency graph \(H\) needed in Lemma 2.1. We use \(X\) to denote either a pair of adjacent vertices of \(G\) or (the vertex set of) an even cycle of \(G\), and write \(A_{X}\) for the corresponding event of Type I or Type II. Let \(V(H)=\{A_{X}|A_{X} \) is an event of Type I or Type II}. Two nodes \(A_{X},A_{Y}\in V(H)\) are adjacent if and only if \(X\cap Y\ne \emptyset .\) Since the occurrence of each \(A_{X}\) depends only on the colors of the vertices in \(X\), and the colors are chosen independently, \(H\) is a dependency graph for our events. In order to apply the Local Lemma, we have to estimate the probability of each event and the number of nodes of each type in the graph \(H\) that are adjacent to any given node.
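As a small illustration of the construction of \(H\), the adjacency rule between two event nodes only compares the underlying vertex sets; the helper below is a sketch (the name `events_adjacent` is ours, not from the paper).

```python
def events_adjacent(X, Y):
    """Nodes A_X and A_Y of the dependency graph H are adjacent iff X and Y share a vertex."""
    return bool(set(X) & set(Y))
```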

Since every cycle of \(G\) has length at least \(g\), we only need to consider cycles of length at least \(g\) in what follows.

It is easy to see that an event of Type I (in what follows also called a type-1 event) happens with probability \(\frac{1}{x}\). We have the following claim.

Claim 1

The probability of an event of type-\(i\) is at most \((\frac{2}{x})^{i-2}\left( {\begin{array}{c}i\\ i-2\end{array}}\right) \).

Proof

Let \(C\) be an even cycle of length \(i\ge g\) that is colored with exactly two colors. Then there exists a subset \(B\) of the vertices of \(C\) of size \(i-2\) such that every vertex in \(B\) receives a color that also appears in \(C\backslash B\). There are \(\left( {\begin{array}{c}i\\ i-2\end{array}}\right) \) ways to choose such a set \(B\). If a set \(B\) and a coloring of \(C\backslash B\) are fixed, then the probability that only colors of \(C\backslash B\) are used for \(B\) is at most \((\frac{2}{x})^{i-2}\). This completes the proof of Claim 1. \(\square \)
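As a sanity check of Claim 1 (not needed for the proof), the bound can be compared with the exact probability that a fixed cycle of length \(i\) receives exactly two colors, namely \(\binom{x}{2}(2^i-2)/x^i\). The sketch below uses the illustrative extreme values \(\Delta =4\) (so \(x=48\)) and \(i=16\).

```python
from fractions import Fraction
from math import comb

Delta, i = 4, 16     # smallest maximum degree and smallest cycle length allowed by Theorem 2
x = 12 * Delta       # number of colors

# exact probability that a fixed cycle of length i receives exactly two colors
exact = Fraction(comb(x, 2) * (2**i - 2), x**i)

# the bound of Claim 1: C(i, i-2) * (2/x)^(i-2)
bound = comb(i, i - 2) * Fraction(2, x)**(i - 2)

assert exact <= bound
print(float(exact), float(bound))   # roughly 9.3e-20 versus 5.7e-18
```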

Proposition 2.3

In the dependency graph \(H\), each vertex of type-1 is adjacent to at most \(2\Delta \) vertices of type-1 and to at most \(2{\Delta }^{i-1}\) vertices of type-\(i\). A vertex of type-\(i\) is adjacent to at most \(i\Delta \) vertices of type-1 and to at most \(i{\Delta }^{j-1}\) vertices of type-\(j\).

Proposition 2.3 follows immediately from the facts that every vertex of \(G\) is incident with at most \(\Delta \) edges and lies on at most \(\Delta ^{j-1}\) cycles of length \(j\). Let \(x_1=\frac{1}{8\Delta }\) and \(x_i=\frac{1}{(2\Delta )^i}\) for \(g\le i\le n\). In order to conclude that with positive probability none of the bad events occurs, it suffices, by Lemma 2.1, to prove the following Claim 2.

Claim 2

It holds that

$$\begin{aligned} Pr(A_1)\le x_1(1-x_1)^{2\Delta }\prod \limits _{j=g}^{n}(1-x_j)^{2\Delta ^{j-1}}; \end{aligned}$$

For all \(g\le i\le n\),

$$\begin{aligned} Pr(A_i)\le x_i(1-x_1)^{i\Delta }\prod \limits _{j=g}^{n}(1-x_j)^{i\Delta ^{j-1}}, \end{aligned}$$

where \(n\) is the order of \(G\).

Proof

Note that \((\frac{1}{3})^t\le 1-t\) for all \(0\le t\le \frac{1}{24}\). Since \(\Delta \ge 4\), we have \(x_1=\frac{1}{8\Delta }\le \frac{1}{24}\) and \(x_i=\frac{1}{(2\Delta )^i}\le \frac{1}{24}\) for all \(g\le i\le n\), and hence \((\frac{1}{3})^{x_i}\le (1-x_i)\) for every \(i\). Recall that \(g\ge 4\Delta \ge 16\). Hence it follows that

$$\begin{aligned} \begin{aligned} x_1(1-x_1)^{2\Delta }\prod \limits _{j=g}^{n}(1-x_j)^{2\Delta ^{j-1}}&\ge x_1\left( \frac{1}{3}\right) ^{2\Delta x_1}\prod \limits _{j=g}^{n}\left( \frac{1}{3}\right) ^{2\Delta ^{j-1}x_j}\\&\ge x_1\left( \frac{1}{3}\right) ^{2\Delta x_1+\sum _{j=g}^{\infty }\left( \frac{2}{\Delta 2^j}\right) }\\&=\frac{1}{8\Delta }\left( \frac{1}{3}\right) ^{\frac{1}{4}+\left( \frac{2^{-g+1}}{\Delta }\right) \sum _{j=0}^{\infty }{2^{-j}}}\\&\ge \frac{1}{8\Delta }\left( \frac{1}{3}\right) ^{\frac{1}{4}+\left( \frac{2^{-14}}{3}\right) }\\&\ge \frac{1}{12\Delta }=\frac{1}{x}\ge Pr(A_1). \end{aligned} \end{aligned}$$
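The two elementary estimates used in this chain, namely \((\frac{1}{3})^t\le 1-t\) on \([0,\frac{1}{24}]\) and \(\frac{1}{8\Delta }(\frac{1}{3})^{\frac{1}{4}+\frac{2^{-14}}{3}}\ge \frac{1}{12\Delta }\), can be verified numerically; the sketch below is illustrative only (note that the factor \(\frac{1}{\Delta }\) cancels in the second comparison).

```python
# check (1/3)^t <= 1 - t on a fine grid over [0, 1/24]
N = 10_000
for k in range(N + 1):
    t = k / (24 * N)
    assert (1/3)**t <= 1 - t

# check (1/8) * (1/3)^(1/4 + 2^(-14)/3) >= 1/12; the 1/Delta factor cancels on both sides
assert (1/8) * (1/3)**(0.25 + 2**-14 / 3) >= 1/12
```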

For all \(i\ge g\ge 4\Delta \), we have

$$\begin{aligned} 3e\Delta i\le 3^{\left( \frac{7}{8}-\left( \frac{1}{\Delta 2^{g-1}}\right) \right) \left( \frac{i}{2}\right) } \end{aligned}$$

as can be seen as follows. When \(i=4\Delta \), the left-hand side equals \(12e{\Delta }^2\) and the right-hand side equals \(3^{\frac{7}{4}\Delta -\frac{2}{2^{g-1}}}\). For \(\Delta =4\) and \(g\ge 16\), a direct computation shows that \(12e{\Delta }^2\le 3^{\frac{7}{4}\Delta -\frac{2}{2^{g-1}}}\), and since the right-hand side grows exponentially in \(\Delta \) while the left-hand side grows only quadratically, the inequality also holds for all \(\Delta \ge 4\) and \(g\ge 16\). Moreover, viewed as functions of \(i\), the right-hand side grows exponentially while the left-hand side grows only linearly, so the inequality holds for all \(i\ge 4\Delta \) (a numerical sanity check is sketched after the next display). So for \(i\ge g\),

$$\begin{aligned} \begin{aligned} P(A_i)&\le \left( \frac{2}{x}\right) ^{i-2}\left( {\begin{array}{c}i\\ i-2\end{array}}\right) \\&=\left( \frac{2}{x}\right) ^{i-2}\left( {\begin{array}{c}i\\ 2\end{array}}\right) \\&\le \left( \frac{2}{x}\right) ^{i-2}\left( \frac{ei}{2}\right) ^{2}\\&=\left( {\frac{1}{6\Delta }}\right) ^{i}(3e\Delta i)^{2}\\ \end{aligned} \end{aligned}$$
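The base case of the inequality \(3e\Delta i\le 3^{(\frac{7}{8}-\frac{1}{\Delta 2^{g-1}})\frac{i}{2}}\) at \(\Delta =4\), \(g=i=16\), together with a range of larger values, can be checked numerically; the following is only an illustrative sketch.

```python
from math import e

def lhs(Delta, i):
    return 3 * e * Delta * i

def rhs(Delta, g, i):
    return 3 ** ((7/8 - 1/(Delta * 2**(g - 1))) * (i / 2))

# base case Delta = 4, g = i = 4*Delta = 16: roughly 522 <= 2187
assert lhs(4, 16) <= rhs(4, 16, 16)

# a sample of larger maximum degrees and cycle lengths with g = 4*Delta and i >= g
for Delta in range(4, 20):
    g = 4 * Delta
    for i in range(g, g + 50):
        assert lhs(Delta, i) <= rhs(Delta, g, i)
```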

Since \(3e\Delta i\le 3^{(\frac{7}{8}-(\frac{1}{\Delta 2^{g-1}}))(\frac{i}{2})}\), we know that

$$\begin{aligned} \begin{aligned} P(A_i)&\le \frac{1}{(2\Delta )^i}\left( \frac{1}{3}\right) ^{\frac{i}{8}+\frac{i}{\Delta 2^{g-1}}}\\&=\frac{1}{(2\Delta )^i}\left( \frac{1}{3}\right) ^{\frac{i}{8}+\left( \frac{i2^{-g}}{\Delta }\right) \sum _{j=0}^{\infty }2^{-j}}\\&\le \frac{1}{(2\Delta )^i}\left( \frac{1}{3}\right) ^{\frac{i}{8}+\sum _{j=g}^{n}\frac{i}{\Delta 2^j}}\\&=x_i\left( \frac{1}{3}\right) ^{i\Delta x_1}\prod \limits _{j=g}^{n}\left( \frac{1}{3}\right) ^{i\Delta ^{j-1}x_j}. \end{aligned} \end{aligned}$$

Since for all \(0\le t\le \frac{1}{24}\), \((\frac{1}{3})^t\le (1-t)\), we obtain

$$\begin{aligned} \begin{aligned} P(A_i)&\le x_i(1-x_1)^{i\Delta }\prod \limits _{j=g}^{n}(1-x_j)^{i\Delta ^{j-1}}. \end{aligned} \end{aligned}$$

This completes the proof of Claim 2. \(\square \)

Therefore, by Proposition 2.3, Claims 1 and 2, and Lemma 2.1, with positive probability none of the bad events occurs, and hence Theorem 2 is true. \(\square \)