1 Introduction

In the current era of technology, with the ever-growing advancement of both computers and the Internet, multimedia data (audio, video and images) is used extensively in diverse applications such as the military, e-commerce, telemedicine, video conferencing, broadcasting and financial transactions. Digital imaging applications are widespread and increasing rapidly, yet the major hurdles in the expansion of these applications and services are security, storage and confidentiality [1]. Since all existing transmission media and networks, whether wired or wireless, are insecure, transmitted data can be interrupted, intercepted and modified [2]. In such a scenario, a natural question about the security and confidentiality of multimedia data arises. The solution is provided by cryptography, a discipline at the intersection of computer science and mathematics. Cryptography is the science encompassing the principles, rules and methods for converting an understandable, clear message into an unintelligible form and then reconverting that unintelligible message into its original form [3]. Two commonly used technologies for providing security, authenticity and confidentiality for multimedia data are encryption and watermarking [4].

Encryption is the process of translating a plaintext message into a form known as ciphertext, which should not be readable by anyone without the reverse process, decryption, which transforms the ciphertext back into the plaintext. Encryption applies special mathematical algorithms (sets of rules) and keys to convert the original message into ciphertext, while decryption uses algorithms and keys to recover the original message. Digital watermarking [5] is used to hide or embed information in multimedia data so that the data is protected from illegal copying, manipulation and modification.

A watermark is classified on the basis of its application as a visible or an invisible watermark [6]. A visible watermark is typically embedded in a digital image as a clearly visible message or a company logo indicating ownership of the image. For example, most currency bills carry a visible watermark to distinguish counterfeit from genuine currency. In invisible digital watermarking, a signal is added to the multimedia data such that it cannot be perceived [7, 8]. Digital watermarking schemes can be divided into two main classes: symmetric and asymmetric. In symmetric watermarking, the keys used during watermark embedding and detection are identical. If the keys for watermark embedding and detection are different, the scheme is called asymmetric watermarking [9–12].

An encryption algorithm can be divided into two types: block ciphers and stream ciphers. A block cipher treats a block of plaintext as a whole and produces a ciphertext block of the same length. For example, a block cipher may take a 128-bit block of plaintext as input and output a corresponding 128-bit block of ciphertext. A block cipher is basically a symmetric key cipher, which means that all blocks are encrypted and decrypted with the same key. For greater security, the block length and key size are kept large. A stream cipher encrypts a digital data stream one bit or one byte at a time. Examples of classical stream ciphers are the autokeyed Vigenère cipher and the Vernam cipher [13]. The basic purpose of using a stream cipher is to obtain algorithms that are considerably faster than a typical block cipher; stream ciphers are often used to lower hardware complexity and to execute at a higher speed than block ciphers [14]. Block ciphers have the advantage over stream ciphers that a large block can be divided into a number of small blocks which can be encrypted serially [15].
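To make the distinction concrete, the toy Python sketch below (not a real or secure cipher; the hash-based keystream and masks are purely illustrative assumptions) encrypts the same byte string once in stream fashion, byte by byte, and once as fixed-size blocks.

```python
import hashlib
import os

def toy_keystream(key: bytes, length: int) -> bytes:
    """Derive an illustrative keystream by hashing the key with a counter."""
    stream = b""
    counter = 0
    while len(stream) < length:
        stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return stream[:length]

def toy_stream_encrypt(key: bytes, data: bytes) -> bytes:
    """Stream style: each byte is combined with one keystream byte at a time."""
    ks = toy_keystream(key, len(data))
    return bytes(d ^ k for d, k in zip(data, ks))

def toy_block_encrypt(key: bytes, data: bytes, block_size: int = 16) -> bytes:
    """Block style: the data is processed as whole fixed-size blocks."""
    out = b""
    for start in range(0, len(data), block_size):
        block = data[start:start + block_size].ljust(block_size, b"\x00")  # naive padding
        mask = hashlib.sha256(key + start.to_bytes(4, "big")).digest()[:block_size]
        out += bytes(b ^ m for b, m in zip(block, mask))
    return out

key = os.urandom(16)
message = b"multimedia data to protect"
print(toy_stream_encrypt(key, message).hex())
print(toy_block_encrypt(key, message).hex())
```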

There are many types of encryption algorithms, depending on the application and its requirements. One classification, based on the volume of data to be encrypted, distinguishes complete from selective encryption [5]. As the names indicate, in complete encryption the whole data is encrypted, whereas in selective encryption only a portion of the content is encrypted. Both approaches have their own pros and cons. In terms of security, complete encryption provides a higher security level because the whole multimedia content is transformed into an unreadable form, whereas in selective encryption, since only a portion of the content is encrypted, the security level declines accordingly. Complete encryption has low efficiency due to the large amount of data to encrypt, while selective encryption requires less time for encryption/decryption, resulting in more efficient algorithms. Complete encryption is also known as direct encryption and selective encryption as partial encryption [13, 16].

Encryption algorithms can be classified into two major types: symmetric and asymmetric key algorithms [13, 16]. In a symmetric key algorithm, the same key is used for encryption and decryption, which is why it is also known as a private key or one-key algorithm [13, 16]. DES (Data Encryption Standard), 3DES (Triple DES), AES (Advanced Encryption Standard), CFES (Compression Friendly Encryption Scheme) and CCCMES (Chaotically Coupled Chaotic Map Encryption Scheme) are examples of symmetric key algorithms. An asymmetric key algorithm, also known as a public key algorithm [13, 16], uses two different keys for encryption and decryption: the public key is used to encrypt the original message and the private key is used to decrypt it. RSA is the most common example of an asymmetric key algorithm. Symmetric key algorithms have a speed advantage over asymmetric ones; in asymmetric key cryptography, encryption and decryption are slow, so it is considered appropriate only for short messages such as keys.

A common question is: when the field of cryptography is already well matured, why are new image encryption techniques required [17]? In traditional cryptographic techniques such as block ciphers, a change in a single bit of the encrypted image can cause a complete decryption failure. This is because traditional cryptographic techniques were designed for text-based applications, where every bit must be decrypted correctly to ensure successful decipherment of the transmitted message. The situation is different in multimedia applications such as images. What matters in a digital image is its content rather than the exact pixel values. For example, Fig. 1a, b show the original Cameraman image and its JPEG-compressed version, respectively. Although both images are perceptually the same, their pixel values differ. Lossy compression, enhancement and geometric transformations are common operations on digital images. If an image is encrypted using a traditional encryption scheme such as AES and then passed through JPEG lossy compression, decryption will fail completely. In conventional cryptographic techniques, the decrypted data is exactly the same as the original (plaintext) data. However, this is not a necessary requirement for multimedia data such as audio, images or video. As discussed above, in most multimedia applications an approximation of the original content is sufficient and a small distortion is acceptable due to human visual perception [17].

Fig. 1 Illustration of manipulation in an image. a Original Cameraman image. b Compressed version of the image shown in Fig. 1a, JPEG QF = 70

Recently, as the exchange of multimedia data over the Internet has increased dramatically, security issues for multimedia have also emerged. This has motivated researchers to develop novel multimedia encryption schemes [12, 13, 16]. Although these encryption schemes meet various application requirements, they are still not mature. One aspect that needs attention is the security analysis of the schemes proposed in the literature: a number of encryption schemes have been found to be insecure [2, 12, 18, 19]. This provides the basic motivation for this research. Multimedia encryption and decryption schemes should be designed to provide a high level of both security and efficiency [19]. In image encryption algorithms, the most important issue is how to determine the quality of encryption in terms of security and efficiency [20]. The main theme of this paper is to investigate the security and efficiency of some traditional image encryption schemes; there is a real need to develop techniques that can address the challenges posed by multimedia content and services. A general framework presented in [1] is used for the evaluation of image encryption schemes.

The quality of an image encryption scheme can be judged by visual inspection, but in some cases it may not give any indication about the hidden loopholes. In this research, the primary objective is to study a number of parameters that help to evaluate an image encryption scheme. Using these parameters, a comparison of conventional encryption schemes like Advanced Encryption Standard (AES), Compression Friendly Encryption Scheme (CFES) [17], Chaotically Coupled Chaotic Map Encryption Scheme (CCCMES) [21] and Bernoulli Map Based Encryption Scheme (BMBES) [22] is made to demonstrate the effectiveness of these schemes.

The rest of the paper is organized as follows. Section 2 discusses the well-known AES algorithm, CFES [17], CCCMES [21] and BMBES [22]. As visual inspection is not sufficient to judge how well an encryption scheme hides image features, Sect. 3 discusses parameters for evaluating an image encryption scheme. Using these security parameters, a comparison is carried out among AES, CFES, CCCMES and BMBES: all the schemes are analyzed using parameters such as the correlation coefficient, information entropy, compression friendliness, Mean Square Error (MSE), Number of Pixel Change Rate (NPCR), Unified Average Change Intensity (UACI) and key sensitivity. The paper ends with the conclusion in Sect. 4.

2 Overview of AES, CFES, CCCMES and BMBES

In this section, an overview of four traditional schemes (AES, CFES, CCCMES and BMBES) is given. AES and CFES are non-chaotic schemes, while CCCMES and BMBES are chaos-based image encryption schemes. For a better understanding of their security features, we briefly review the fundamentals of each scheme.

2.1 Overview of Advanced Encryption Standard

AES (Advanced Encryption Standard) is a symmetric block cipher adopted by the US government for electronic data encryption [13, 16, 23]. AES was published by NIST (National Institute of Standards and Technology) in November 2001 and supersedes DES (Data Encryption Standard). DES was designed in the 1970s primarily for hardware implementation and does not produce efficient software code [13, 16, 19]. Triple DES (3DES) was then introduced to overcome the drawbacks of DES. No cryptanalytic attack on 3DES based on the algorithm itself is known, other than brute force [13, 16, 19], so with respect to security 3DES is very resistant to cryptanalysis. If security were the only concern, 3DES would be a strong candidate for a standard encryption algorithm. Its major drawback is that it is very slow, since it takes three times as many rounds as DES. In addition, DES and 3DES use a block size of 64 bits, whereas a larger block size is required for higher efficiency and security. Because of these drawbacks, NIST issued a call for proposals for a new encryption standard in 1997, specifying that the new algorithm must be a symmetric block cipher with a block length of 128 bits and support for key lengths of 128, 192 and 256 bits [13, 16, 19]. After evaluating several candidates, NIST selected the Rijndael algorithm, developed by the Belgian cryptographers Joan Daemen and Vincent Rijmen, as the new standard, now known as the Advanced Encryption Standard (AES). Compared to DES, AES offers stronger security and improved efficiency. Interested readers can find the technical details of AES in [23].

2.2 Compression Friendly Encryption Scheme

In modern communication, encryption techniques also face constraints on storage and bandwidth. Most commonly used encryption techniques, such as DES (Data Encryption Standard), 3DES (Triple Data Encryption Standard) and AES, are not compression friendly: even a one-bit change in the ciphertext results in complete failure of the decryption process. To overcome this issue, the authors in [17] proposed a new algorithm called the Compression Friendly Encryption Scheme (CFES). Its most distinguishing property is the ability to tolerate JPEG compression, meaning that if the encrypted image is JPEG compressed, decryption is still possible with some acceptable distortion.

The authors of [17] designed an image encryption technique that not only fulfils the requirements of cryptography but is also capable of withstanding JPEG lossy compression. An important point is that CFES is not proposed for lossless encryption: it can recover an image that is perceptually identical to the original plaintext with a high PSNR, but exact recovery of the original pixel values is not a goal of CFES, since in most multimedia applications an image of reasonable perceptual quality is acceptable. Depending on the requirements of the application, low-level or high-level encryption is achievable, because CFES can generate cipher images with variable perceptual distortion. The CFES encryption and decryption processes are shown in Fig. 2; detailed steps are given in [17].

Fig. 2 Block diagram of CFES [17]

2.3 Chaotically Coupled Chaotic Map Encryption Scheme

Chaotic systems have several significant features favourable to secure communication, such as sensitivity to initial conditions, ergodicity, pseudo-randomness, determinism, mixing (stretching and folding) and complexity. These are closely related to the two important properties of a cipher: confusion and diffusion. The confusion mechanism rearranges pixel positions, while the diffusion mechanism changes the value of each pixel; the confusion and diffusion processes can be repeated many times to obtain a higher security level [24]. Recently, many different chaotic cryptosystems have been proposed, and the security analysis of these cryptosystems needs careful attention. One example of a chaos-based encryption scheme is proposed in [21]; it is based on chaotically coupled chaotic maps. Owing to the chaotic nature of the maps, the scheme can be secure even with a single map; however, discrete chaotic systems generate periodic sequences, and since the period increases exponentially with the number of coupled maps, the maps are coupled for better security [21]. This cryptosystem exploits only the essence of chaos: the high sensitivity of a chaotic trajectory to initial conditions and its recurrence property. The authors of [21] use only parameter values for which the maps exhibit chaotic behaviour. A block diagram of CCCMES is shown in Fig. 3; the steps shown in Fig. 3 are discussed in more detail in [21].

Fig. 3 Block diagram of CCCMES [21]

2.4 Bernoulli Map Based Encryption Scheme

In [22], an efficient image encryption scheme based on the generalized Bernoulli map is proposed. Both confusion and diffusion are incorporated in the Bernoulli Map Based Encryption Scheme (BMBES): in the confusion process, pixel positions are shuffled, while in the diffusion process, pixel values are modified using generalized Bernoulli shift maps. First, pseudo-random numbers are generated using Bernoulli shift maps, and the permutation is obtained by sorting these pseudo-random numbers. To add diffusion, two generalized Bernoulli shift maps are used to generate two pseudo-random grayscale value sequences, which are then used to modify the pixel gray values sequentially. Because of the sensitivity of the generalized Bernoulli map to its initial conditions, the scheme is highly resistant to known attacks; the encrypted image changes unpredictably even if a single bit of the plaintext image changes. The confusion and diffusion processes of this scheme are shown in Figs. 4 and 5, respectively, and a detailed analysis of BMBES is given in [22].

Fig. 4 Confusion process of BMBES

Fig. 5 Diffusion process of BMBES

3 Experimental Results and Evaluations of AES, CFES, CCCMES & BMBES

This section discusses the security analysis of AES, CFES, CCCMES and BMBES. The comparison of these schemes is carried out using security parameters such as the correlation coefficient, information entropy, compression friendliness, number of pixel change rate and unified average change intensity. Some interesting properties of these encryption schemes are presented in the subsequent subsections.

3.1 Correlation Coefficient Analysis

The correlation coefficient is a statistical measure of the linear relationship between two variables; in the case of image encryption, these variables are the plaintext and the ciphertext. The correlation coefficient indicates the strength of the dependence between two quantities [33] and takes values between +1 and \(-1\). A correlation of zero means that no linear relationship between the two variables is detected. A correlation coefficient of 1 means that the plaintext and ciphertext are perfectly correlated, i.e., the encrypted image is exactly the same as the plaintext image. A correlation coefficient of \(-1\) indicates a perfect negative linear relationship, meaning that the encrypted image is the negative of the plaintext image. The smaller the magnitude of the correlation coefficient, the better the quality of encryption. Mathematically, the correlation coefficient can be written as [20, 25, 26]:

$$\begin{aligned} C\cdot C&= \frac{Cov(x,y)}{\sigma _{x} \times \sigma _{y}}, \end{aligned}$$
(1)
$$\begin{aligned} \sigma _{x}&= \sqrt{VAR(x)}, \end{aligned}$$
(2)
$$\begin{aligned} \sigma _{y}&= \sqrt{VAR(y)}, \end{aligned}$$
(3)
$$\begin{aligned} VAR(x)&= \frac{1}{N}\sum \limits _{i=1}^N(x_{i}-E(x))^{2}, \end{aligned}$$
(4)
$$\begin{aligned} Cov(x,y)&= \frac{1}{N}\sum \limits _{i=1}^N(x_{i}-E(x))(y_{i}-E(y)), \end{aligned}$$
(5)

where \(C\cdot C\) is the correlation coefficient, \(Cov(x,y)\) is the covariance between \(x\) and \(y\), and \(x\) and \(y\) are the grayscale values of two pixels at the same position in the plaintext and ciphertext images. \(VAR(x)\) is the variance of the pixel values \(x\), \(\sigma _{x}\) is the corresponding standard deviation, \(E\) is the expected value operator and \(N\) is the total number of pixels (or pixel pairs) considered.
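As an illustration, the following NumPy sketch implements Eqs. (1)–(5); the helper for sampling adjacent-pixel pairs (used for the 1000-pair test described next) and its parameter names are our own assumptions for illustration, not code from the compared schemes.

```python
import numpy as np

def correlation_coefficient(x: np.ndarray, y: np.ndarray) -> float:
    """Eqs. (1)-(5): C.C = Cov(x, y) / (sigma_x * sigma_y)."""
    x = x.astype(np.float64).ravel()
    y = y.astype(np.float64).ravel()
    cov = np.mean((x - x.mean()) * (y - y.mean()))                  # Eq. (5)
    return float(cov / (np.sqrt(x.var()) * np.sqrt(y.var())))       # Eqs. (1)-(4)

def adjacent_pixel_cc(img: np.ndarray, pairs: int = 1000,
                      direction: str = "horizontal", seed: int = 0) -> float:
    """Correlation of randomly sampled pairs of adjacent pixels in one image."""
    rng = np.random.default_rng(seed)
    h, w = img.shape
    rows = rng.integers(0, h - 1, pairs)
    cols = rng.integers(0, w - 1, pairs)
    dr, dc = {"horizontal": (0, 1), "vertical": (1, 0), "diagonal": (1, 1)}[direction]
    return correlation_coefficient(img[rows, cols], img[rows + dr, cols + dc])
```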

The well-known Cameraman and Baboon images are used to test all encryption schemes for correlation coefficient analysis. Tests are performed by selecting 1000 pairs of adjacent pixels, and the correlation coefficients between two horizontally adjacent, two vertically adjacent and two diagonally adjacent pixels are computed. The simulation results for horizontally, vertically and diagonally adjacent pixels are given in Tables 1 and 2 for the Cameraman and Baboon images, respectively.

Table 1 Correlation coefficient of two adjacent pixels: Cameraman image
Table 2 Correlation coefficient of two adjacent pixels: Baboon image

The tables show that, for all of the above-mentioned encryption schemes, the correlation between vertically and diagonally adjacent pixels of the ciphertext is close to zero, whereas the correlation between adjacent pixels of the plaintext image is close to 1. The correlation between horizontally adjacent pixels is also close to zero for all schemes except CFES.

3.2 Entropy Analysis

Entropy is an important parameter for analyzing an encryption scheme. Entropy measures the amount of information contained in data and indicates the degree of unpredictability and randomness in a system [36]; in technical terms, it measures how difficult a system is to predict. In this context, the term usually refers to the Shannon entropy, introduced by Claude E. Shannon in 1948. The quality of image encryption is usually assessed by the Shannon entropy of the ciphertext image [27]: encrypting an image decreases the mutual information among pixel values and thus increases the entropy. The entropy of a message source \(m\) with symbols \(m_{i}\) is denoted \(H(m)\), where \(p(m_{i})\) is the probability of occurrence of symbol \(m_{i}\). A secure system should satisfy a condition on the information entropy such that the ciphertext image does not provide any information about the original image [28]. The entropy \(H(m)\) can be calculated as [29–31]:

$$\begin{aligned} H(m)= \sum \limits _{i=0}^{2^{N}-1} p(m_{i})\times \log _{2}\frac{1}{p(m_{i})}. \end{aligned}$$
(6)
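A minimal NumPy sketch of Eq. (6) for an 8-bit grayscale image (so \(N = 8\) and the sum runs over the 256 possible gray levels) might look as follows; zero-probability levels are skipped since they contribute nothing to the sum.

```python
import numpy as np

def shannon_entropy(img: np.ndarray) -> float:
    """Eq. (6): Shannon entropy of an 8-bit grayscale image."""
    hist = np.bincount(img.astype(np.uint8).ravel(), minlength=256)
    p = hist / hist.sum()        # p(m_i): empirical probability of each gray level
    p = p[p > 0]                 # zero-probability levels contribute nothing
    return float(np.sum(p * np.log2(1.0 / p)))
```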

The information entropy test has been performed on the Cameraman and Baboon images of size \(256 \times 256\). For such 8-bit images, the theoretical maximum value is 8 bits. If the entropy of the ciphertext produced by an encryption scheme is noticeably less than 8, the scheme leaks information and the plaintext becomes predictable. Simulation results for all schemes are shown in Table 3. The table shows that the entropy value for CFES is lower than for the other three schemes, so it is easier to predict the original image in the case of CFES, and hence CFES is insecure against an entropy attack. The entropy is greater than 7.98 for AES, CCCMES and BMBES. The entropy analysis thus shows that security is high for all schemes except CFES.

Table 3 Entropy results

3.3 Encryption Quality Measurement

An important issue in image encryption is evaluating the quality of encryption. Earlier studies on image encryption relied on visual inspection to judge the effectiveness of an encryption technique [20]. An image encryption algorithm is good if it is able to conceal a large number of image features. In some scenarios visual inspection is sufficient, but it gives no indication of the amount of information concealed. To judge the quality of encryption, a number of measures have been proposed in the literature [18–20, 25, 32].

3.3.1 Maximum Deviation

The maximum deviation measures the quality of an encryption scheme in terms of how much it maximizes the deviation between the plaintext and the ciphertext [33]. The more the ciphertext deviates from the plaintext, the better the encryption algorithm. The steps for calculating the maximum deviation are given in [34]; a sketch of one common formulation is given below.
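The exact procedure is given in [34]; as a rough illustration only, the sketch below follows one formulation commonly used in the image-encryption literature, namely the area under the curve of absolute histogram differences between plaintext and ciphertext, and should be treated as an assumption rather than the authors' exact steps.

```python
import numpy as np

def maximum_deviation(P: np.ndarray, C: np.ndarray) -> float:
    """Assumed formulation: area under |histogram(P) - histogram(C)|,
    with the two end bins weighted by 1/2 (see [34] for the exact steps)."""
    hp = np.bincount(P.astype(np.uint8).ravel(), minlength=256)
    hc = np.bincount(C.astype(np.uint8).ravel(), minlength=256)
    d = np.abs(hp - hc).astype(np.float64)
    return float((d[0] + d[255]) / 2.0 + d[1:255].sum())
```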

3.3.2 Irregular Deviation

The irregular deviation measures how close the statistical distribution of the histogram of deviations between the plaintext and ciphertext is to a uniform distribution; if this deviation histogram is close to uniform, the encryption algorithm is considered good [20]. The irregular deviation is calculated as follows:

1. Take the absolute difference of the plaintext image (P) and the ciphertext image (C) [20]:

   $$\begin{aligned} D=|P-C|. \end{aligned}$$
   (7)

2. Calculate the histogram of \(D\):

   $$\begin{aligned} H= histogram (D). \end{aligned}$$
   (8)

3. Let \(h_{i}\) be the amplitude of the histogram at index \(i\). Then the average value \(M_{H}\) of the histogram is:

   $$\begin{aligned} M_{H}=\frac{1}{256}\sum \limits _{i=0}^{255}h_{i}. \end{aligned}$$
   (9)

4. Calculate the absolute histogram deviations from \(M_{H}\) as follows [20]:

   $$\begin{aligned} H_{D_i}=|h_{i}-M_{H}|. \end{aligned}$$
   (10)

5. The irregular deviation \(I_{D}\) is then calculated as [20]:

   $$\begin{aligned} I_{D}=\sum \limits _{i=0}^{255}H_{D_i}. \end{aligned}$$
   (11)

The smaller the value of \(I_{D}\) is, the better the encryption quality is.
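The steps above translate directly into a few lines of NumPy; the minimal sketch below assumes two 8-bit grayscale images of equal size.

```python
import numpy as np

def irregular_deviation(P: np.ndarray, C: np.ndarray) -> float:
    D = np.abs(P.astype(np.int16) - C.astype(np.int16))     # Eq. (7)
    h = np.bincount(D.ravel(), minlength=256)                # Eq. (8): histogram of D
    M_H = h.mean()                                           # Eq. (9): average bin height
    H_D = np.abs(h - M_H)                                    # Eq. (10)
    return float(H_D.sum())                                  # Eq. (11)
```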

3.3.3 Deviation From Uniform Histogram

A histogram shows the frequency distribution of the pixel values of an image: the horizontal axis represents the gray-level values and the vertical axis the number of pixels at each gray level [18]. The histogram of the encrypted image should hide the frequency distribution of the original image. Using the histogram, an attacker can perform frequency analysis to deduce information about the secret key; this is known as a statistical attack. To prevent statistical attacks, the histogram of the plaintext image and the histogram of the ciphertext image should not have any similarity, and the ciphertext image should have a uniform distribution; a relatively uniform distribution of the ciphertext indicates that the encryption algorithm has good quality. For an image encryption algorithm, the histogram of the encrypted image should possess two important properties: (1) it should be totally different from the histogram of the plaintext image; and (2) it should be uniformly distributed, which means that the probability of occurrence of each pixel value is the same. This ideal histogram can be formulated as [18]:

$$\begin{aligned} H_{C_i}=\left\{ \begin{array}{ll} \frac{M\times N}{256} &{} \quad 0\le C_{i}\le 255\\ 0 &{} \quad \text {elsewhere} \\ \end{array} \right. \end{aligned}$$
(12)

The deviation from uniform histogram shown by Eq. 12 is calculated as [18]:

$$\begin{aligned} D_{p}= \displaystyle \frac{\sum \nolimits _{C_i=0}^{255}|H_{C_i} -H_{C}|}{M\times N}, \end{aligned}$$
(13)

where \(H_{C}\) is the histogram of the encrypted image and \(H_{C_i}\) is the histogram of an ideally encrypted image, which is completely uniform. The lower the value of \(D_{p}\), the better the encryption quality.
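A minimal NumPy sketch of Eqs. (12) and (13), assuming an 8-bit grayscale ciphertext image of size \(M \times N\):

```python
import numpy as np

def deviation_from_uniform_histogram(C: np.ndarray) -> float:
    M, N = C.shape
    H_C = np.bincount(C.astype(np.uint8).ravel(), minlength=256)   # actual histogram
    H_ideal = np.full(256, (M * N) / 256.0)                        # Eq. (12)
    return float(np.abs(H_ideal - H_C).sum() / (M * N))            # Eq. (13)
```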

The Cameraman and Baboon images are used to evaluate the maximum deviation \((M_{D})\), the irregular deviation \((I_{D})\) and the deviation from a uniform histogram \((D_{p})\). Results for all schemes are shown in Tables 4 and 5. It can be observed that AES and BMBES have higher maximum deviation values than CFES and CCCMES, indicating that AES and BMBES are more secure with respect to this parameter. CFES has a smaller maximum deviation and hence a lower level of information hiding. With respect to maximum deviation, AES is the better candidate for encrypting data.

A smaller irregular deviation indicates a more secure scheme. The BMBES algorithm has a smaller value than the other three schemes, so in terms of irregular deviation BMBES is better, because the deviation between the pixel values of the plaintext and the corresponding ciphertext is larger and more random. In the \((D_{p})\) analysis, AES and BMBES have smaller values than CFES and CCCMES; the larger values of CFES and CCCMES indicate that their ciphertext images deviate more from the ideal uniform histogram. Overall, according to the encryption quality measures, BMBES can be considered better than AES, CFES and CCCMES.

Table 4 Encryption quality results for Cameraman image
Table 5 Encryption quality results for Baboon image

3.4 Avalanche Effect

The avalanche effect is a desirable property for checking the efficiency of the diffusion mechanism: a single-bit change in a plaintext image \(P\) should cause a significant modification in its corresponding ciphertext image \(C\) [35]. In block ciphers, a small change in the key or plaintext should cause a drastic change in the ciphertext. Let \(C_{1}\) and \(C_{2}\) be two ciphertext images whose corresponding keys differ by one bit; the avalanche effect is the percentage of differing bits between \(C_{1}\) and \(C_{2}\). If \(C_{1}\) and \(C_{2}\) differ in about half of their bits, the encryption algorithm possesses good diffusion characteristics [36].

The mean square error (MSE) between \(C_{1}\) and \(C_{2}\) can be calculated as [37, 38]:

$$\begin{aligned} MSE=\frac{1}{M \times N}\sum \limits _{i=0}^{N-1} \sum \limits _{j=0}^{M-1}\left[ C_{1}(i,j)-C_{2}(i,j)\right] ^{2}, \end{aligned}$$
(14)

where \(M\) and \(N\) are the width and height of the images, and \(C_{1}(i,j)\) and \(C_{2}(i,j)\) are the grayscale values of the pixels at position \((i,j)\) in the ciphertext images \(C_{1}\) and \(C_{2}\), respectively. The authors of [39] discuss the MSE and note that, generally speaking, if the value obtained using Eq. 14 is ≥30 dB, the difference between the two images is evident [39]. From Table 6, it is clear that all encryption schemes have MSE values greater than 30. It can also be seen that the chaotic-map-based encryption schemes have larger MSE values than the non-chaotic ones. This highlights an interesting property of chaos-based encryption: changing one bit in the plaintext image makes the difference between the ciphertext images evident, because chaotic maps are more sensitive to changes in the plaintext. Table 6 also shows that CFES is the least sensitive to changes in the plaintext images.
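A minimal NumPy sketch of Eq. (14), assuming two equal-sized grayscale ciphertext images:

```python
import numpy as np

def mse(C1: np.ndarray, C2: np.ndarray) -> float:
    """Eq. (14): mean square error between two ciphertext images."""
    diff = C1.astype(np.float64) - C2.astype(np.float64)
    return float(np.mean(diff ** 2))
```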

Table 6 Mean square error (MSE) results

3.5 Number of Pixels Change Rate (NPCR) and Unified Average Change Intensity (UACI)

To test the sensitivity of a whole encrypted image to a single-bit change, two common measures are used: NPCR and UACI. These are the two most widely used security analyses for differential attacks. The Number of Pixels Change Rate (NPCR) gives the percentage of differing pixels between two encrypted images whose plaintexts differ in only one pixel. The Unified Average Change Intensity (UACI) gives the average intensity difference between two ciphertext images whose corresponding plaintexts differ in only one pixel [40].

Let \(C_{1}\) and \(C_{2}\) be two different ciphertext images whose corresponding plaintext images differ by only one bit. Label the grayscale value of the pixel at grid \((i,j)\) in \(C_{1}\) and \(C_{2}\) by \(C_{1}(i,j)\) and \(C_{2}(i,j)\), respectively. We define an array \(D\) to have the same size as \(C_{1}\) and \(C_{2}\). Then \(D(i,j)\) is determined by using \(C_{1}(i,j)\) and \(C_{2}(i,j)\) as: if \(C_{1}(i,j) = C_{2}(i,j)\) then \(D(i,j) = 0\), otherwise \(D(i,j) = 1\).

The \(NPCR\) is defined as [15, 41]:

$$\begin{aligned} NPCR= \frac{\sum _{i,j}D(i,j)}{W\times H}\times 100\,\%, \end{aligned}$$
(15)

where \(W\) and \(H\) are the width and height of ciphertext images \(C_{1}\) and \(C_{2}\), respectively.

Equation 15 thus gives the percentage of differing pixels between the two ciphertext images. \(NPCR\) can also be interpreted as the rate of change of pixels in the encrypted image caused by the change of a single pixel in the original image [30].

Unified Average Change Intensity \((UACI)\) determines the average intensity of differences between two images. Mathematically, \(UACI\) can be defined as [37, 38]:

$$\begin{aligned} UACI=\frac{1}{W\times H} \left[ \sum _{i,j}\frac{C_{1}(i,j)-C_{2}(i,j)}{255}\right] \times 100\,\%. \end{aligned}$$
(16)

The higher the values of \(NPCR\) and \(UACI\), the better the quality of encryption. From Tables 7 and 8, it is clear that CCCMES has better diffusion characteristics than AES, CFES and BMBES. With respect to NPCR and UACI, the results in Tables 7 and 8 also show that CFES is less sensitive to small changes in the plaintext images. Overall, these results indicate that CCCMES has a stronger diffusion mechanism than the other schemes.
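A minimal NumPy sketch of Eqs. (15) and (16), assuming 8-bit grayscale ciphertext images; following common practice, the UACI sketch uses the absolute intensity difference.

```python
import numpy as np

def npcr(C1: np.ndarray, C2: np.ndarray) -> float:
    """Eq. (15): percentage of pixel positions at which C1 and C2 differ."""
    D = (C1 != C2)                      # D(i, j) = 1 where the pixels differ
    return float(D.mean() * 100.0)

def uaci(C1: np.ndarray, C2: np.ndarray) -> float:
    """Eq. (16): average intensity difference, normalized to 255."""
    diff = np.abs(C1.astype(np.float64) - C2.astype(np.float64))
    return float((diff / 255.0).mean() * 100.0)
```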

Table 7 Number of pixel change rate (NPCR) results
Table 8 Unified average change intensity (UACI) results

3.6 Key Sensitivity Test

A good encryption algorithm should be sensitive to the secret key and the plaintext, i.e., a change of a single bit in the secret key or plaintext should cause a drastic change in the ciphertext [13]. Secure cryptosystems require high key sensitivity, which means that the encrypted image should not decrypt correctly even if there is only a small difference between the encryption and decryption keys. Let \(C_{1}\) and \(C_{2}\) be two ciphertext images whose corresponding keys differ by only one bit; the percentage difference between these two ciphertext images is calculated. The simulation results in Table 9 show that all encryption schemes perform well for key sensitivity: by changing just one bit in the key, more than 99 % of the pixels of the encrypted image change. The schemes based on chaotic maps give better results than the non-chaotic ones.
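The test procedure can be sketched as follows; `encrypt` is a hypothetical placeholder for any of the studied schemes (AES, CFES, CCCMES, BMBES), so only the bit-flip and comparison logic is meant literally.

```python
import numpy as np

def flip_one_bit(key: bytes, bit_index: int = 0) -> bytes:
    """Return a copy of the key with a single bit flipped."""
    k = bytearray(key)
    k[bit_index // 8] ^= 1 << (bit_index % 8)
    return bytes(k)

def key_sensitivity(encrypt, plain_img: np.ndarray, key: bytes) -> float:
    """Percentage of ciphertext pixels that change when the key differs by one bit."""
    C1 = encrypt(plain_img, key)                 # `encrypt` is a hypothetical placeholder
    C2 = encrypt(plain_img, flip_one_bit(key))
    return float((C1 != C2).mean() * 100.0)
```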

Table 9 Difference of two ciphers when keys differ by one bit

4 Conclusion

In this paper, four image encryption techniques have been evaluated with respect to security. The results indicate which technique is better suited for transmitting multimedia data securely over a network. The comparison was based on security parameters such as the correlation coefficient, information entropy, encryption quality, NPCR, UACI, MSE and the key sensitivity test.

In the correlation coefficient analysis, the results show that the correlation between vertically and diagonally adjacent pixels is close to zero, i.e., as small as required, for all schemes; except for CFES, the correlation between horizontally adjacent pixels is also small for all schemes. Lower correlation values of an encrypted image indicate higher security. The entropy values for CFES were lower than those of AES, CCCMES and BMBES; the three schemes with higher entropy values are more secure and resistant to entropy attacks. The maximum deviations of AES and BMBES are higher than those of the other two schemes, so AES and BMBES are very secure with respect to the maximum deviation parameter. The irregular deviation values are lowest for CCCMES, which indicates that it performs better than AES, CFES and BMBES on that measure. BMBES has the smallest deviation from a uniform histogram; this lower value represents better encryption quality, because it means that the histogram of the ciphertext image deviates less from the uniform histogram.

The diffusion characteristic of a cryptosystem is an important parameter for comparing different encryption schemes. For this purpose, the avalanche effect test was performed in terms of NPCR, UACI and MSE. From the results obtained through NPCR, UACI and MSE, we observed that all encryption schemes show significant differences for small changes, i.e., they all have MSE values above 30 dB. Compared to the other schemes, CCCMES has higher values of MSE, NPCR and UACI. The MSE values for CFES are approximately 34 dB, which means that changing one bit in the plaintext does not produce a large difference between the ciphertext images. The key sensitivity test shows that more than 99 % of pixels change for different keys in all schemes.