1 Introduction

Biometric recognition has advanced rapidly and is now used almost universally. Biometric recognition approaches identify and verify unique characteristics precisely and quickly to govern the access process in specialized applications (Arqub and Abo-Hammour 2014; Soliman et al. 2021a; Abo-Hammour et al. 2013; Algarni et al. 2020a; Ibrahim et al. 2020a). Therefore, restricting access and preventing attackers from damaging or identifying the stored templates are critical.

The two types of biometrics utilized by humans are physical and behavioral (Alarifi et al. 2020; Ibrahim et al. 2020b). The face, iris, retina, palmprint, and fingerprint are physical characteristics, whereas behavioral characteristics, such as voice, signature, keystroke patterns, and gait, are measured from the behavior of the body and its response to various conditions.

All of these biometrics represent characteristics of the human body, and they are used to ensure that no hackers can gain access to the provided services (El-Shafai et al. 2023a, 2022a; Elazm et al. 2023). Passwords have historically been used to protect encryption keys from being stolen or compromised. Most people employ the same passwords across many applications and never change them, as using different lengthy passwords is inconvenient. If an attacker gains access to the database and a portion of the secret password is stolen, the other services may be at risk (El-Shafai et al. 2021a).

Organizations continuously strive to preserve their data safely and improve their network services to prevent unauthorized access. Authentication and identification are performed to ensure that only permitted users can access a secure and reliable location. Traditional mechanisms, such as personal identification numbers (PINs) and passwords, have been used for years. For increased security, magnetic cards combined with PINs have been implemented (Faragallah et al. 2020; El-Shafai et al. 2023b; Hassan et al. 2022).

Some problems of the conventional methods arise from the fact that they recognize tokens related to the owner rather than the owner himself. The system may simply admit any outsider if these tokens are stolen or duplicated. A newer class of verification strategies uses biometrics in many applications, such as government services, commercial applications, and forensic applications that rely on human supervision to identify biometrics (Nassar et al. 2023; Almomani et al. 2023).

Biometric templates should be guarded when an application requires a high degree of secrecy. This in turn increases the accuracy and secrecy of identifying people (Elazm et al. 2023; El-Shafai et al. 2022a). As illustrated in Fig. 1, a biometric system comprises four levels: image sensor, processing unit, database, and output device (Ahmad et al. 2022). In the enrollment stage, the biometric system is trained to recognize a specific client. The client initially presents an identity, such as an identity card. The biometric (e.g., finger, hand, face, or iris) is then presented to the input (sensor) device. The characteristics or features are extracted and saved as a reference pattern for later examinations. After enrollment, the next step is to confirm that the client is the person who enrolled. After the user supplies whatever identity he or she registered with, the captured biometric is presented, and a new template is generated. The system then compares the new biometric pattern with the reference pattern saved upon enrollment to see whether the new and stored patterns match.

Fig. 1
figure 1

Biometric system

Biometric-based authentication systems have provided powerful security characteristics, particularly in telemedicine applications, to protect personal data against offline password threats (Ahmad et al. 2022). Even though biometrics solve numerous security problems, hackers have discovered strategies to attack biometric authentication processes and databases containing biometric information. Biometrics are difficult to secure since the hashing strategy used to safeguard passwords does not operate with biometric information. As a result, any business that uses biometrics to authenticate its members must guarantee that their biometric information is appropriately protected (Salama et al. 2022; Almomani et al. 2022a). Biometric encryption approaches provide excellent confidentiality, security, and uniqueness for permitted persons. Encryption keys boost the security of biometric cryptosystems. Original biometric characteristics are not stored directly in the server with these cryptosystems but are first processed and turned into distorted, transformed templates (El-Shafai et al. 2022b, 2022c).

Cancelable biometrics (El-Shafai et al. 2022d) play a vital role in biometric security by transforming the basic biometric into a new irreversible domain. Instead of saving the basic characteristics directly on the server, cancelable biometrics are first generated by deforming the biometric characteristics using various modification functions, as shown in Fig. 2.

Fig. 2
figure 2

Example of a cancelable biometric template generated by a transformation function

These cancelable templates have four significant properties (Ibrahim et al. 2020a): (i) diversity, (ii) non-invertibility, (iii) revocability, and (iv) high performance. Diversity indicates that no two users should be assigned the same cancelable biometric pattern. Non-invertibility indicates that the original biometric characteristics cannot be retrieved from the cancelable biometric patterns. Revocability indicates that if an individual's cancelable biometric pattern is stolen, a new cancelable pattern can be assigned to him/her without affecting the actual biometric identity. Finally, the high-performance property specifies that the recognition accuracy of a cancelable biometric system should be as high as that of a standard biometric system. If one's allocated cancelable biometric template is hacked, a new template is assigned by deleting the prior one, as is the case with ATM passwords.

The motivation behind the development of this work is to address the security and privacy concerns associated with biometric recognition systems. Biometric recognition systems are widely used for identification and authentication purposes, but they are vulnerable to attacks such as spoofing and replay attacks, which can compromise the security of the system. The proposed system uses 3D optical chaos and DNA encryption to enhance the security of biometric recognition systems. The system is cancellable, which means that the biometric data can be modified without affecting the recognition performance, thus protecting the privacy of the users. The use of 3D optical chaos and DNA encryption ensures that the biometric data cannot be easily replicated or tampered with. Overall, the motivation behind the development of this system is to provide a more secure and privacy-preserving biometric recognition system that is robust against various attacks and can be used in a wide range of applications, including access control, surveillance, and forensics.

In this research work, a new authentication system built on 3D chaotic maps, the Piecewise Linear Chaotic Map (PWLCM), the Logistic Map, and Deoxyribonucleic acid (DNA) sequence theory is suggested. First, the 3D chaotic stage comprises the following processes: (1) 3D chaos creation, (2) chaos histogram equalization, (3) row rotation, (4) column rotation, and (5) an XOR process. Secondly, the PWLCM and the Logistic Map are used to generate all of the main values required by DNA theory to generate the key image. Thirdly, DNA theory is applied as a second encryption algorithm to the output of the 3D chaotic algorithm. Fourthly, the intermediate image is decoded and used in the following step of the DNA encoding process. Finally, step 3 is repeated on the columns to obtain the final encrypted image.

The rest of this paper is organized as follows: Sect. 2 includes some past works. Section 3 presents the suggested cancelable biometric authentication architecture. Section 4 gives an explanation of the assessment metrics that were used. Section 5 summarizes the computer simulation results and performance evaluations. Section 6 summarizes the final remarks.

2 Related work

Cancellable biometric methodologies are utilized in the verification system to provide distorted versions of biometrics (El-Hameed et al. 2022; Helmy et al. 2022a). If required, authorized features can be removed or replaced in hacking cases. The cancelable biometric strategy is devoted to preserving the highest precision level of the saved biometrics to improve user privacy.

Mohamed et al. (2022) developed an iris recognition authorization strategy. It depends on a mix of non-invertible transformations and encryption to disguise the iris template, and it achieved a 99.9% recognition rate. Various security strategies for face authentication are provided in Ayoup et al. (2022a), where a variety of procedures were employed to extract geometric characteristics. Another proposed approach is based on creating concealed templates after applying Gabor filters. This approach employed two types of chaotic maps: logistic and modified logistic maps. With the chaotic logistic map, this methodology achieved 99.08% accuracy and 1.175% EER. Also, the genetic algorithm (GA) encryption approach (Arqub and Abo-Hammour 2014; Abo-Hammour et al. 2013) can be used to attain biometric image cancelability (Alshammri et al. 2022). The GA (Abo-Hammour et al. 2014; Abu Arqub et al. 2012) method chooses the appropriate secret key with the best length to strengthen encryption. Soliman et al. (2018a) described a face biometric authentication method based on the Double Random Phase Encoding (DRPE) algorithm. The bio-convolving approach was used to safeguard the safety and privacy of the individuals' faces. In addition, the same researchers (Soliman et al. 2018b) proposed a multi-biometric authentication architecture.

It focuses on combining characteristics of multiple biometric patterns. A one-way iris pattern is created using a Fractional Fourier Transform (FRFT)-based approach. The random phase masks RPM1 and RPM2 are employed as cryptographic keys in the described cancelable biometric strategy. RPM1 is the first cover for the left iris feature vector, and RPM2 is for the right iris feature pattern of the individual. The implemented strategy achieved an EER of 0.63% and an efficiency of 99.75%.

El-Hameed et al. (2022) proposed a cancelable geometric methodology for fingerprint recognition. It works by extracting feature pixels and then using polar and Cartesian coordinate operations to generate cancelable fingerprint characteristics. This method maintains cancelability while achieving high accuracy, privacy, and security. The researchers of Helmy et al. (2022a) presented a fingerprint recognition security technique based on multiple spiral curves and fuzzy principles. The equal error rate (EER) of this method is 1.17%.

The researchers of Hammad et al. (2019) used a two-dimensional Gabor filter to extract palmprint features, followed by a two-dimensional palm Hash algorithm to mask the features and construct the cancelable palmprint vector. In Leng et al. (2014), a multi-biometric method is used to construct cancelable biometrics in order to obtain high privacy and secrecy depending on different feature fusion degrees. In Qiu et al. (2019), a new palmprint pattern protection system is developed based on random comparison and noise data. An anisotropic filter (AF) is used to collect the palmprint's orientation information. The palmprint's orientation characteristic is then assessed using a chaotic matrix, resulting in a protected and cancelable palmprint pattern. Furthermore, noise data with an independent and identically distributed distribution is included as the last cancelable palmprint pattern to improve the template's privacy security. In addition, a method for producing cancelable palmprint patterns was developed using coupled nonlinear dynamic filters and various orientation palm codes (Qiu et al. 2019). In conclusion, cancelable biometric templates can be created using a variety of techniques, including random projection (Soliman et al. 2021b; Asaker et al. 2021; El-Gazar et al. 2022), hashing functions (YYYY xxxx), and salting (Almomani et al. 2022b), but they all have the same basic goal of transforming the original biometric pattern into a distorted version, making it complicated for an adversary to recover the original template from the distorted one. Cancellable template models have been applied to biometric characteristics from numerous attributes, such as the face and palmprint, to create several safe biometric architectures.

3 Proposed cancelable biometric authentication framework

In this section, we propose a cancellable biometric methodology based on two hybrid encryption techniques: a pixel rotation with XOR-based encryption technique using 3D chaos, and DNA theory as a second encryption algorithm applied to the output of the 3D chaotic algorithm, to provide distorted versions of biometrics. The 3D chaotic system is utilized for position permutation and value transformation. In the position permutation strategy, the pixel position is permuted without affecting the pixel value of the actual image, whereas in the value transformation strategy, the pixel value is changed to another value without modifying its position, as in Helmy et al. (2022b). Figure 3 depicts the flowchart of the recommended cancelable biometric authentication architecture. The suggested framework's major stages are summarized as follows:

  (a) 3D chaos generation.

  (b) Chaos histogram equalization.

  (c) Row rotation.

  (d) Column rotation.

  (e) XOR operation.

  (f) Piecewise linear chaotic map and logistic map.

  (g) MD5 hash.

  (h) Deoxyribonucleic acid (DNA) string.

  (i) DNA cryptosystem.

Fig. 3
figure 3

The proposed cancelable biometric system

3.1 3D chaos generation system

The logistic chaos map is used to generate 3D chaos in the proposed chaotic-based cryptosystem. The logistic map is one of the most basic and well-known chaotic maps; here it is used for changing the positions of pixels in images and adjusting gray-level pixel values. The nonlinear logistic map chaos generation system is described by the following equation:

$$x_{n + 1} = \mu x_{n} \left( {1 - x_{n} } \right),$$
(1)

where \(0 < x_{n} < 1\), and \(\mu = 4\) is the condition for this map to be fully chaotic. The 3D logistic map chaos generation expressions are formulated as in Helmy et al. (2022b) and Ayoup et al. (2022):

$$x_{n + 1} = \gamma x_{n} \left( {1 - x_{n} } \right) + \beta y_{n}^{2} x_{n} + \alpha z_{n}^{3} ,$$
(2)
$$y_{n + 1} = \gamma y_{n} \left( {1 - y_{n} } \right) + \beta z_{n}^{2} y_{n} + \alpha x_{n}^{3} ,$$
(3)
$$z_{n + 1} = \gamma z_{n} \left( {1 - z_{n} } \right) + \beta x_{n}^{2} z_{n} + \alpha y_{n}^{2} ,$$
(4)

where \(3.53 < \gamma < 3.81\), \(0 < \beta < 0.022\), \(0 < \alpha < 0.015\), and the initial values of \(x,y,z\) lie between 0 and 1. These parameters make the 3D logistic map more complicated and secure than 1D or 2D logistic maps.
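The 3D map of Eqs. (2)–(4) can be iterated directly. The following minimal Python sketch uses parameter values chosen from the stated ranges (the specific values are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def logistic_3d(x, y, z, n, gamma=3.7, beta=0.01, alpha=0.01):
    """Iterate the 3D logistic map of Eqs. (2)-(4) for n steps.

    Parameter ranges follow the text: 3.53 < gamma < 3.81,
    0 < beta < 0.022, 0 < alpha < 0.015, initial x, y, z in (0, 1).
    """
    xs, ys, zs = np.empty(n), np.empty(n), np.empty(n)
    for i in range(n):
        x_new = gamma * x * (1 - x) + beta * y**2 * x + alpha * z**3
        y_new = gamma * y * (1 - y) + beta * z**2 * y + alpha * x**3
        z_new = gamma * z * (1 - z) + beta * x**2 * z + alpha * y**2
        x, y, z = x_new, y_new, z_new
        xs[i], ys[i], zs[i] = x, y, z
    return xs, ys, zs

# three key streams from assumed initial values in (0, 1)
xs, ys, zs = logistic_3d(0.235, 0.35, 0.735, 1000)
```

For these parameter ranges every term is non-negative and the sum stays below 1, so the three sequences remain inside (0, 1).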

3.2 Chaos histogram equalization

The histograms of X, Y, and Z have non-uniform distributions, as illustrated in Fig. 4a–c. The histograms should be equalized to raise the degree of security. The following equations are used to equalize the histograms for a grey image of \(M \times N\) dimensions:

$$x = \left( {{\text{integer}}\left( {x \times N_{2} } \right)} \right)\bmod N,$$
(5)
$$y = \left( {{\text{integer}}\left( {y \times N_{4} } \right)} \right)\bmod M,$$
(6)
$$z = \left( {{\text{integer}}\left( {z \times N_{6} } \right)} \right)\bmod 256,$$
(7)

where \(N_{2} ,N_{4} , N_{6}\) are large random numbers greater than 10,000. We may also assume \(N_{2} , N_{4} , \;{\text{and}}\; N_{6}\) to be equal for simplicity. Using \(N_{2} = N_{4} = N_{6} = 100{,}000\), \(M = 256\), \(N = 256\), Fig. 4d–f depict the equalized histograms.

Fig. 4
figure 4

Histogram Equalization of the 3D chaos system
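A direct Python reading of Eqs. (5)–(7), taking the integer(·) operation as the floor and setting the large constants equal as suggested in the text:

```python
import numpy as np

def equalize_chaos(xs, ys, zs, M=256, N=256, N2=100_000, N4=100_000, N6=100_000):
    """Map chaotic reals in (0,1) to near-uniformly distributed integers,
    following Eqs. (5)-(7)."""
    xi = np.floor(xs * N2).astype(np.int64) % N     # Eq. (5): values in 0..N-1
    yi = np.floor(ys * N4).astype(np.int64) % M     # Eq. (6): values in 0..M-1
    zi = np.floor(zs * N6).astype(np.int64) % 256   # Eq. (7): gray levels 0..255
    return xi, yi, zi
```

Because the multiplier is much larger than the modulus, the low-order residues spread the values almost uniformly over each range.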

3.2.1 Row rotation

Depending on the value of chaos \(x\) from Eq. (5), Helmy et al. (2022b) proposed a novel strategy for row rotation that would aid us in image pixel permutation. When the chaos value is even, the rows must be rotated left; when the chaos value is odd, the rows must be rotated right, as shown in Fig. 5.

Fig. 5
figure 5

Row rotation process

3.2.2 Column rotation

Column rotation and row rotation have a lot in common. The value of chaos Y determines a column rotation in Eq. (6). When the chaos is an even number, we must rotate the columns up, and when the chaos is an odd value, we must rotate the columns down (Helmy et al. 2022b), as shown in Fig. 6.

Fig. 6
figure 6

Column rotation process
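Both rotation steps can be sketched together. The text fixes only the direction (even chaos value: left/up; odd: right/down), so the rotation amount is assumed here to be the chaos value itself:

```python
import numpy as np

def rotate_rows(img, x_keys):
    """Rotate each row by its chaos key: left for an even key, right for odd
    (amount assumed equal to the key value)."""
    out = np.asarray(img).copy()
    for r in range(out.shape[0]):
        k = int(x_keys[r])
        out[r] = np.roll(out[r], -k if k % 2 == 0 else k)
    return out

def rotate_cols(img, y_keys):
    """Rotate each column by its chaos key: up for an even key, down for odd."""
    out = np.asarray(img).copy()
    for c in range(out.shape[1]):
        k = int(y_keys[c])
        out[:, c] = np.roll(out[:, c], -k if k % 2 == 0 else k)
    return out
```

Both operations are pure permutations: the pixel values are preserved and the steps are invertible given the chaos keys.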

3.2.3 XOR operation

The XOR operation is the final stage in this encryption procedure. The pixel value is changed to a new value via the XOR operation, which cannot be reversed without the chaos key. We XOR the chaos sequence with the row- and column-rotated image to produce the encrypted image.
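A minimal sketch of this stage, taking the chaos bytes to be the equalized z-sequence of Eq. (7); since XOR is self-inverse, the same function also decrypts:

```python
import numpy as np

def xor_stage(img, z_keys):
    """XOR every pixel with a chaos byte; applying the same function
    again with the same keys recovers the image."""
    key = np.resize(np.asarray(z_keys, dtype=np.uint8), np.asarray(img).shape)
    return np.bitwise_xor(np.asarray(img, dtype=np.uint8), key)
```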

3.2.4 Piecewise linear chaotic map and logistic map

The Piecewise Linear Chaotic Map (PWLCM) and the logistic map are utilized to generate all of the parameters required by the approach. PWLCM offers various advantages, according to Ayoup et al. (2022), Eldesouky et al. (2022) and Faragallah et al. (2022), such as its representation simplicity and execution efficiency. It is utilized to create the key image, whereas the logistic map is used to govern the operations or DNA rules chosen for encoding and to reconstruct every row of the basic image. Equation (8) defines the PWLCM, whereas Eq. (9) represents the logistic map.

$$X_{n + 1} = F_{p} \left( {X_{n} } \right) = \left\{ {\begin{array}{*{20}c} {X_{n} /p} & {0 < X_{n} < p} \\ {{\raise0.7ex\hbox{${\left( {X_{n} - p} \right)}$} \!\mathord{\left/ {\vphantom {{\left( {X_{n} - p} \right)} {\left( {0.5 - p} \right)}}}\right.\kern-0pt} \!\lower0.7ex\hbox{${\left( {0.5 - p} \right)}$}}} & {p \le X_{n} < 0.5} \\ {F_{p} \left( {1 - X_{n} } \right)} & {0.5 \le X_{n} < 1} \\ \end{array} } \right\},$$
(8)

where \(X_{n} \in \left( {0,1} \right)\) and \(p \in \left( {0,0.5} \right).\) Assume that \(p = 0.25678900.\)

$$X_{n + 1} = \mu X_{n} \left( {1 - X_{n} } \right),$$
(9)

where \(X_{n} \in \left( {0,1} \right)\) and \(\mu \in \left( {0,4} \right]\). Assign the value of µ = 3.99999999 (Ayoup et al. 2022b).
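Equations (8) and (9) translate directly into code; a sketch with the stated parameter values:

```python
def pwlcm(x, p=0.256789):
    """One iteration of the piecewise linear chaotic map, Eq. (8)."""
    if x >= 0.5:          # symmetric branch of Eq. (8): F_p(1 - x)
        x = 1.0 - x
    if x < p:
        return x / p
    return (x - p) / (0.5 - p)

def logistic(x, mu=3.99999999):
    """One iteration of the logistic map, Eq. (9)."""
    return mu * x * (1.0 - x)
```

Iterating either function from a seed in (0, 1) yields the pseudo-random sequence consumed by the DNA stage.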

3.2.5 MD5 hash

MD5 (Message-Digest Algorithm 5) is a widely used hash algorithm that generates a 128-bit hash value from the input data, generally written as a 32-digit hexadecimal number (El-Shafai et al. 2022e). The initial values of the chaos maps are determined from the MD5 digest, as computed by Eq. (10):

$$X_{0} = \bmod \left( {d_{1} \oplus d_{2} \oplus d_{3} \oplus d_{4} ,256} \right)/255.$$
(10)

The chaotic map's initial value is \(X_{0}\); with a certain probability, the estimated value of \(X_{0}\) can be exactly 0 or 1, in which case it is discarded and Eq. (10) is applied again to obtain another value. The parameters \(d_{1} ,d_{2} ,d_{3} ,d_{4}\) are calculated from the basic image's MD5 hashing result: the 128-bit hash value is divided into four 32-bit segments, from which \(d_{1} ,d_{2} ,d_{3} ,d_{4}\) are obtained. Before using Eq. (10), \(d_{1} ,{ }d_{2} ,{ }d_{3} ,d_{4}\) are simply converted from binary to decimal.
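A possible Python reading of this step is sketched below; taking d1–d4 as the first four bytes of the digest is an assumption (the text's segmentation is ambiguous), and the fallback value for a degenerate seed is purely illustrative:

```python
import hashlib

def initial_value(image_bytes):
    """Derive a chaotic-map seed X0 from the image's MD5 digest, Eq. (10)."""
    digest = hashlib.md5(image_bytes).digest()
    d1, d2, d3, d4 = digest[:4]          # assumed reading: first four bytes
    x0 = ((d1 ^ d2 ^ d3 ^ d4) % 256) / 255
    if x0 in (0.0, 1.0):                 # degenerate seeds are discarded (text)
        x0 = 0.5                         # illustrative fallback only
    return x0
```

Because the seed is derived from the image itself, different plain images drive the chaotic maps from different starting points.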

3.2.6 Deoxyribonucleic acid string (DNA)

DNA comprises four nucleic acid bases: adenine (A), thymine (T), cytosine (C), and guanine (G). A pairs with T, and C pairs with G, forming complementary sequences (Eldesouky et al. 2022). Eight encoding rules satisfy this complementary-pairing constraint. The basic image is encoded using the four nucleobases according to these eight rules. The decimal number 201, for example, has the binary equivalent '11001001'. Encoding '11001001' under the eight DNA rules yields eight different forms: 'TACG', 'TAGC', 'ATCG', 'ATGC', 'CGTA', 'CGAT', 'GCTA', and 'GCAT'.

Furthermore, several DNA sequence operations, such as addition, subtraction, and exclusive OR (XOR), are used to encrypt the data. Tables 1, 2, 3 and 4 outline the DNA match rules and operations (Ayoup et al. 2022; Eldesouky et al. 2022; Faragallah et al. 2022).

Table 1 Eight encoding rules of DNA
Table 2 XOR process for DNA sequence
Table 3 Addition (+) process for a DNA sequence
Table 4 Subtraction (−) process for a DNA sequence
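To make the encoding concrete, the sketch below fixes one of the eight rules; the particular assignment 00→A, 01→G, 10→C, 11→T is an assumption chosen to reproduce the 'TACG' form of the example above. The DNA XOR of Table 2 then reduces to XOR on the underlying 2-bit codes:

```python
# One of the eight encoding rules of Table 1 (assumed assignment):
RULE = {'00': 'A', '01': 'G', '10': 'C', '11': 'T'}
INV = {v: k for k, v in RULE.items()}

def dna_encode(pixel):
    """Encode one 8-bit pixel as four nucleobases under RULE."""
    bits = format(pixel, '08b')
    return ''.join(RULE[bits[i:i + 2]] for i in range(0, 8, 2))

def dna_xor(a, b):
    """DNA XOR (Table 2): XOR the 2-bit codes behind each base pair."""
    return ''.join(RULE[format(int(INV[x], 2) ^ int(INV[y], 2), '02b')]
                   for x, y in zip(a, b))
```

Addition and subtraction (Tables 3 and 4) follow the same pattern with modulo-4 arithmetic in place of the bitwise XOR.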

3.2.7 DNA cryptosystem

In the DNA cryptosystem, we should apply the following steps:

  1. Use Eqs. (8) and (11) to produce the key image:

     $${\text{pixel}} = \left[ {{ }X{ } \times { }256} \right],$$
     (11)

     where the PWLCM recurrence value is \({\text{X}} \in \left( {0,1} \right)\). Equation (10) determines the starting value of Eq. (8). Neighboring pixels in the key image should be unrelated to one another, and pixels produced from a chaotic map are a great way to achieve this requirement: the expected value of the next generated pixel differs from the current one.

  2. The FECT and key images are encoded row by row using the DNA rules selected by Eqs. (9) and (12):

     $${\text{Rule}}\left( {{\text{standard}}} \right) = \left[ {{\text{X}} \times 8} \right] + 1,$$
     (12)

     The initial value of Eq. (9) is obtained from Eq. (10). Table 1 lists the eight DNA rules. Each pixel in a row is encoded with one of the DNA rules, and each row uses a new DNA rule until the entire image is encoded. Following the DNA encoding rules, the 8 bits of each image pixel are separated and encoded into 4 nucleobases.

  3. DNA operations are performed row by row between the encoded FECT and the encoded key image. Equations (9) and (13) indicate the type of DNA operation to be carried out, and Tables 2, 3 and 4 outline the fundamentals of the DNA operations:

     $${\text{process}}\left( {{\text{operation}}} \right) = \left[ {{\text{X}} \times 3} \right] + 1,$$
     (13)

     The DNA operations (XOR, +, −) are conducted row by row until the encoded intermediate image is obtained.

  4. By applying the decoding procedure, expressed by Eq. (12), to the previous intermediate image, the decoded intermediate image is obtained. Using this process, we obtain a preliminary cipher image.

  5. The first cipher image is rotated 90 degrees anti-clockwise. In this way, the image is processed first by rows and then by columns.

  6. Repeat steps 2 through 4 for the columns to create the final encrypted cancelable biometric template.
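Steps 1–3 above rely on Eqs. (11)–(13) to turn chaotic values into key pixels, rule indices, and operation indices. Assuming [·] denotes the floor, these selectors can be sketched as:

```python
def key_pixel(x):
    """Eq. (11): map a PWLCM value X in (0,1) to a key-image pixel 0..255."""
    return int(x * 256)

def select_rule(x):
    """Eq. (12): choose one of the eight DNA encoding rules (index 1..8)."""
    return int(x * 8) + 1

def select_operation(x):
    """Eq. (13): choose the DNA operation; the index-to-operation
    assignment (e.g. 1=XOR, 2=addition, 3=subtraction) is an assumption."""
    return int(x * 3) + 1
```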

4 Authentication evaluation metrics

Visual examination is important in assessing cancelable biometrics, since robust encryption and high cancelability are highly desired features of the recommended cancelable biometric cryptosystem. However, quality assessment does not rely just on direct observation. As a result, numerous measurements are used to assess the performance of the cancelable biometric architecture. Correlation criteria assess the relationship between a recorded biometric pattern and a biometric input pattern; the greater the correlation value, the more similar the templates. Access to the system is granted if the correlation score for a checked user exceeds a certain level. Mathematically, an authorized person's correlation ratio must be greater than that of a hacker attempting to get access to the database.

The receiver operating characteristic (ROC) curve may be used to assess the success of the presented cancelable biometric authentication system. The true positive factor (TPF) and the false-positive factor (FPF) are represented by the ROC curve as in Ibrahim et al. (2020b) and Hashad et al. (2020). The ROC curve idea is based on a decision variable. In every biometric identification system, the tested information includes genuine and false templates, allowing each pattern value to be dispersed around a given mean value.

As a result, the authentic template's mean value is greater than the false template's mean value. Furthermore, the encryption efficiency is evaluated by calculating the correlation coefficient, histogram deviation, SSIM (Structural Similarity Index Metric), MSE (Mean Square Error), PSNR (Peak Signal-to-Noise Ratio), NPCR (Number of Pixel Change Rate), UACI (Unified Average Changing Intensity), and histogram uniformity between protected and original biometrics. For cancelable biometric systems, the values of correlation, PSNR, and SSIM should be at their lowest, while the values of NPCR, MSE, and UACI should be at their highest. These authentication metrics are explored in depth as follows in order to assess the quality of the proposed cancelable biometric authentication framework:

4.1 Histogram analysis

The histogram depicts the distribution of pixel intensities in a biometric image. In the case of cancelable biometric technologies, the histogram of the protected biometric template, which is generated by an encryption methodology, must satisfy both of the following conditions (Algarni et al. 2020b):

  1. The cipher biometric image's histogram differs from the basic biometric image's histogram.

  2. The cipher biometric image's histogram must be uniform, meaning all pixel values occur with nearly equal frequency.

4.2 Correlation score

The correlation is a comparison of the biometric template and its distorted replica. It is calculated between two adjacent pixels of an image as in the following equations (Helmy et al. 2022b):

$$C_{i,j} = \frac{{cov\left( {i,j} \right)}}{{\sqrt {D\left( i \right)} \sqrt {D\left( j \right)} }},$$
(14)
$$D\left( i \right) = \frac{1}{M}\mathop \sum \limits_{ii = 1}^{M} \left( {i_{ii} - E\left( i \right)} \right)^{2} ,$$
(15)
$$cov\left( {i,j} \right) = E\left\{ {\left( {i - E\left( i \right)} \right)\left( {j - E\left( j \right)} \right)} \right\},$$
(16)
$$E\left( i \right) = \frac{1}{M}\mathop \sum \limits_{ii = 1}^{M} i_{ii} ,$$
(17)

where \(M\) is the number of pixels, \(C_{i,j}\) is the correlation coefficient, and \(i,j\) are grayscale values of two neighboring pixels in the basic image or ciphered image. The correlation assessment has two scenarios, which are detailed below:

  1. When the correlation coefficient is near one, the score is at its highest, which occurs only when the two biometric images are strongly correlated.

  2. When the \(C_{i,j}\) score is close to or equal to zero, there is a considerable difference between the permitted biometric image and its encrypted form, and the ciphered biometric pattern is completely independent of the primary one.
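Equations (14)–(17) can be computed directly, as in the following sketch:

```python
import numpy as np

def correlation_score(i, j):
    """Correlation coefficient C_ij of Eq. (14), using Eqs. (15)-(17)."""
    i = np.asarray(i, dtype=float).ravel()
    j = np.asarray(j, dtype=float).ravel()
    e_i, e_j = i.mean(), j.mean()                  # Eq. (17)
    cov = np.mean((i - e_i) * (j - e_j))           # Eq. (16)
    d_i = np.mean((i - e_i) ** 2)                  # Eq. (15)
    d_j = np.mean((j - e_j) ** 2)
    return cov / (np.sqrt(d_i) * np.sqrt(d_j))
```

Passing the image and a one-pixel-shifted copy gives the adjacent-pixel correlation; passing two templates gives the template-similarity score used for authentication.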

4.3 The probability of true distribution (PTD) and false distribution (PFD)

In permitted-access scenarios, the PTD reflects the probability distribution of the statistical correlation. The PFD displays the probability distribution of the statistical correlation in spoofed-access conditions. Access to the system is permitted only if a biometric test score exceeds a specified threshold, which is calculated at the intersection of the imposter and genuine distributions, where the Equal Error Rate (EER) is defined. As in El-Shafai et al. (2021b), the false reject and false accept errors are equal at this intersection point.

4.4 The receiver operating characteristic (ROC) curve analysis

Receiver operating characteristic (ROC) analysis is a graphical method of evaluating the efficiency of the structure. A ROC curve depicts the sensitivity (true positive rate) as a function of the false positive rate (1 − specificity) at different cutoff points, as in Faragallah et al. (2021a).

4.5 Structural similarity index metric (SSIM)

It is a strategy for determining the similarity between two images. The SSIM is used as a measure of encryption efficiency in this research. A strong encryption technique should have SSIM values (between the actual and encrypted images) near zero, as in El-Shafai et al. (2021b). It is described by the following equation:

$$SSIM = \frac{{\left( {2\mu_{I} \mu_{T} + S_{1} } \right)\left( {2\sigma_{IT} + S_{2} } \right)}}{{\left( {\mu_{I}^{2} + \mu_{T}^{2} + S_{1} } \right)\left( {\sigma_{I}^{2} + \sigma_{T}^{2} + S_{2} } \right)}},$$
(18)

where \(\mu_{I} \;{\text{and}}\;{ }\mu_{T}\) are the averages of an input image (I) and the encrypted image \(T\), respectively, \(\sigma_{I}^{2}\) and \(\sigma_{T}^{2}\) correspond to the variances of I and \(T\), respectively, \(\sigma_{IT}\) signifies the covariance between them, and finally \(S_{1}\) and \(S_{2}\) are small values.
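A global (single-window) form of Eq. (18) is sketched below; the stabilizer values S1 and S2 are the conventional (0.01 × 255)² and (0.03 × 255)², an assumption since the paper only states that they are small:

```python
import numpy as np

def ssim_global(I, T, S1=6.5025, S2=58.5225):
    """Global SSIM of Eq. (18) between input I and encrypted image T."""
    I = np.asarray(I, dtype=float)
    T = np.asarray(T, dtype=float)
    mu_i, mu_t = I.mean(), T.mean()
    var_i, var_t = I.var(), T.var()
    cov = ((I - mu_i) * (T - mu_t)).mean()
    return ((2 * mu_i * mu_t + S1) * (2 * cov + S2)) / \
           ((mu_i**2 + mu_t**2 + S1) * (var_i + var_t + S2))
```

Identical images score exactly 1; a well-encrypted template should score near 0.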

4.6 Statistical histogram deviation (\({\varvec{D}}_{{\varvec{H}}}\))

This parameter is used to assess the recommended cryptosystem's ciphering strength by determining the amount of deviation between the histograms of the plain and ciphered images, as in Alqahtani et al. (2022). It can be determined using Eq. (19):

$$D_{H} = \frac{{\left( {\frac{{q_{0} + q_{255} }}{2} + \mathop \sum \nolimits_{i = 1}^{254} q_{i} } \right)}}{M \times N},$$
(19)

where \(q_{i}\) is the absolute difference between the pure and ciphered images' estimated histograms at a gray level i. Hence, the greater the magnitude of \(D_{H}\), the greater the deviation of the cipher image from the main image.
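Equation (19) can be sketched as follows, with q_i taken as the absolute histogram difference at gray level i and the two end bins half-weighted:

```python
import numpy as np

def histogram_deviation(orig, cipher):
    """Histogram deviation D_H of Eq. (19) for two same-size 8-bit images."""
    h1, _ = np.histogram(orig, bins=256, range=(0, 256))
    h2, _ = np.histogram(cipher, bins=256, range=(0, 256))
    q = np.abs(h1 - h2)
    return ((q[0] + q[255]) / 2 + q[1:255].sum()) / np.asarray(orig).size
```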

4.7 PSNR metric

The peak signal-to-noise ratio (PSNR) is employed to assess the suggested cryptosystem's strength in the case of channel noise. It is assessed between the actual and ciphered images. PSNR is the ratio of a signal's highest possible value (power) to the power of the distorting noise that affects the quality of its representation, with \(PSNR \in \left[ {0,\infty } \right)\). The PSNR between the cipher image and the basic image should be low for an efficient encryption procedure, as in Alqahtani et al. (2022). The PSNR is calculated as indicated in Eq. (20):

$$PSNR = 10\log_{10} \frac{{\left( {2^{n} - 1} \right)^{2} }}{MSE},$$
(20)

where n is the number of bits per pixel, and MSE is the Mean Squared Error, which can be calculated by the following equation:

$$MSE = \frac{1}{M \times N}\mathop \sum \limits_{x = 1}^{x = M} \mathop \sum \limits_{y = 1}^{y = N} \left[ {O\left( {x,y} \right) - T\left( {x,y} \right)} \right]^{2} ,$$
(21)

where \(O\) is the reference image and \(T\) is the encrypted image.
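Equations (20) and (21) in code; the infinite PSNR for identical images reflects the zero-MSE limit:

```python
import numpy as np

def mse(O, T):
    """Mean squared error between reference O and encrypted T, Eq. (21)."""
    O = np.asarray(O, dtype=float)
    T = np.asarray(T, dtype=float)
    return float(np.mean((O - T) ** 2))

def psnr(O, T, n_bits=8):
    """Peak signal-to-noise ratio in dB, Eq. (20)."""
    e = mse(O, T)
    if e == 0:
        return float('inf')          # identical images: zero-error limit
    return float(10 * np.log10((2 ** n_bits - 1) ** 2 / e))
```

For 8-bit images the maximum possible MSE is 255², at which point the PSNR reaches its floor of 0 dB.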

4.8 Number of pixel change rate (NPCR)

The NPCR measure is defined as the percentage of differing pixels between two ciphered images. If the ciphering procedure produces a higher NPCR value, the given ciphering technique is more resistant to differential attacks. The NPCR metric is calculated as shown in Eqs. (22) and (23), as in El-Shafai et al. (2022d):

$$NPCR = \frac{{\mathop \sum \nolimits_{i,j} g\left( {i,j} \right)}}{M \times N} \times 100$$
(22)
$$g\left( {i,j} \right) = \left\{ {\begin{array}{*{20}c} {0,} & {O\left( {i,j} \right) = T\left( {i,j} \right)} \\ {1,} & {{\text{otherwise}}} \\ \end{array} } \right.$$
(23)

\(g\left( {i,j} \right)\) denotes the difference between corresponding pixels of the plain image and the encrypted image. The estimated NPCR value of a ciphered image must be close to 100%.

4.9 Unified average changing intensity (UACI)

It is employed to measure the intensity difference between two images. The UACI value in cancelable biometric technology must be high, as in El-Shafai et al. (2022d). It can be computed as in Eq. (24):

$$UACI = \frac{1}{M \times N}\left( {\mathop \sum \limits_{i,j} \frac{{\left| {O\left( {i,j} \right) - T\left( {i,j} \right)} \right|}}{255}} \right) \times 100$$
(24)
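Eq. (24) can likewise be sketched in NumPy (an illustrative sketch, assuming 8-bit images so the normalizing constant is 255):

```python
import numpy as np

def uaci(cipher1, cipher2, max_level=255.0):
    """Unified Average Changing Intensity (Eq. 24): the mean absolute
    intensity difference between two images, normalized by the maximum
    gray level and expressed as a percentage."""
    o = np.asarray(cipher1, dtype=np.float64)
    t = np.asarray(cipher2, dtype=np.float64)
    return 100.0 * float(np.mean(np.abs(o - t) / max_level))
```

Two 8-bit images whose pixels all differ by the maximum amount (0 versus 255) produce a UACI of 100%.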

5 Simulation results and discussions

The images used in the study are \(256 \times 256\) images with an 8-bit per pixel resolution. The suggested cryptosystem has the benefit of being able to deal with any type of image (grayscale/color) of varying sizes and characteristics. The proposed cryptosystem is implemented on an Intel(R) Core(TM) i5-10210U CPU @ 1.60 GHz with 8.00 GB RAM running Windows 10. The experimental results are obtained using MATLAB R2019a. The simulation analyses are reviewed visually and by various criteria for an efficient evaluation of the proposed cryptosystem, such as histogram uniformity, correlation coefficient, histogram deviation, Structural Similarity (SSIM) Index, Peak Signal-to-Noise Ratio (PSNR), False Acceptance Rate (FAR), and AROC. We examine two face datasets (the UFI database and the LFW deep funneled database) and the ROI palmprint dataset to evaluate the recommended cancelable biometric authentication architecture. In the simulation studies, the first two sets of trial samples each comprise twelve biometric face images of different persons, while the remaining samples are palmprint images of different users, as shown in Figs. 7, 8 and 9.

Fig. 7
figure 7

The tested twelve biometric images of the LFW-deep funneled database

Fig. 8
figure 8

The tested twelve biometric images of the UFI database

Fig. 9
figure 9

The tested twelve biometric images of the ROI palmprint database

Figures 10, 11 and 12 show the recommended cryptosystem's ciphering performance for each of the evaluated biometric samples. From the perspective of a visual encryption assessment, the suggested cancelable biometric architecture achieves high performance. Tables 5, 6, 7 and 8 present the correlation coefficients of neighboring pixels in the twelve tested biometric images and their ciphered versions for all examined databases.

Fig. 10
figure 10

The cancelable biometrics for the recommended cryptosystem for the LFW-deep funneled database

Fig. 11
figure 11

The cancelable biometrics for the recommended cryptosystem for the UFI face database

Fig. 12
figure 12

The cancelable biometrics for the recommended cryptosystem for the ROI palmprint database

Table 5 Correlation coefficients of neighboring pixels in the twelve tested biometric images and their ciphered versions for the LFW deep funneled database
Table 6 Correlation coefficients of neighboring pixels in the twelve tested biometric images and their ciphered versions for the UFI database
Table 7 Correlation coefficients of neighboring pixels in the twelve tested biometric images and their ciphered versions for the ROI palmprint database
Table 8 Correlation and SSIM values for the twelve biometric images of the LFW deep funneled database

The proposed cryptosystem applies distortion and encryption so that the actual biometrics may be stored securely on a cloud server. Figures 13, 14 and 15 show the ROC, PFD, and PTD curves of the authentication stage of the proposed encryption technique for all analyzed biometric samples. The intersection point of the PFD and PTD curves specifies the decision threshold, which is then used to verify whether or not an individual is a genuine user.

Fig. 13
figure 13

The authentication outcomes of the cryptosystems on the LFW deep funneled dataset

Fig. 14
figure 14

The authentication outcomes of the cryptosystems on the UFI database

Fig. 15
figure 15

The authentication outcomes of the cryptosystems on the ROI palmprint database

Figures 16, 17 and 18 show the histograms of the original and ciphered biometrics produced by the offered cryptosystem for all tested biometric datasets. The recommended cryptosystem provides relatively uniform and flat cipher histograms, demonstrating its resistance to histogram-based statistical attacks. Hence, applying this cryptosystem to the ROI palmprint biometrics database, for example, secures that database against intruders. Two biometric images are examined in every simulation test of the verification process. Both belong to different users: one is the genuine user, and the other is an impostor. To assess the performance of the proposed encryption strategy, the correlation and SSIM values are computed between the two examined encrypted patterns (false and genuine) and the stored secured biometric patterns. Our research assumes that the physical environment contains some noise that might impact the stored or tested biometric patterns; therefore, noise is included in all experimental data.
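The correlation-based part of this verification step can be sketched as follows (an illustrative NumPy sketch: the `verify` function and its 0.5 threshold are hypothetical placeholders, since the paper derives the operating threshold from the PFD/PTD intersection, and the SSIM check is omitted here):

```python
import numpy as np

def correlation(pattern_a, pattern_b):
    """Pearson correlation coefficient between two flattened patterns."""
    a = np.asarray(pattern_a, dtype=np.float64).ravel()
    b = np.asarray(pattern_b, dtype=np.float64).ravel()
    a = a - a.mean()
    b = b - b.mean()
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(encrypted_test, stored_template, threshold=0.5):
    """Accept the claimant when the encrypted test pattern correlates with
    the stored encrypted template above the threshold (hypothetical value;
    the paper obtains it from the PFD/PTD curve intersection)."""
    return correlation(encrypted_test, stored_template) >= threshold
```

A genuine user's encrypted pattern should correlate close to one with the stored template, while an impostor's pattern should correlate close to zero, matching the behavior reported in Tables 8, 9 and 10.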

Fig. 16
figure 16

Results of the proposed cryptosystem's histogram for the original and cipher biometrics of the LFW deep funneled dataset

Fig. 17
figure 17

Results of the proposed cryptosystem's histogram for the original and cipher biometrics of the UFI database

Fig. 18
figure 18

Results of the proposed cryptosystem's histogram for the original and cipher biometrics of the ROI palmprint database

Tables 8, 9 and 10 present the results of the false/genuine correlation analysis and the SSIM evaluation for a selection of patterns from the three biometric datasets. In Table 8, the average correlation/SSIM values of the LFW deep funneled database in the case of a false face are \(0.0162\) and \(8.033 \times 10^{ - 3}\), respectively, while in the case of a genuine face, the average correlation/SSIM values are \(0.9051\) and \(0.9004\), respectively. In Table 9, the average correlation/SSIM values of the UFI database for a false face are \(3.85 \times 10^{ - 3}\) and \(0.0101\), respectively, whereas the average values for a genuine face are \(0.9055\) and \(0.9008\), respectively. In Table 10, the average correlation/SSIM values of the ROI palmprint database for a false face are \(0.0150\) and \(8.15 \times 10^{ - 3}\), respectively, whereas the average values for a genuine face are \(0.9050\) and \(0.9003\), respectively. This indicates that palmprint authentication is successful, because the results for a genuine person are close to one, while the results for a false person are close to zero. According to the outcomes in these tables, the proposed technique produces the highest correlation coefficients and SSIM scores in genuine biometrics enrollment, while it achieves the lowest results in false biometrics enrollment. Several criteria have been studied to ensure the efficacy of the proposed cipher scheme, such as FAR, AROC, PSNR, and Histogram Deviation (\(D_{H}\)). Table 11 shows the numerical evaluation of the FAR, AROC, Histogram Deviation, and PSNR.

Table 9 Correlation and SSIM values for the twelve biometric images of the UFI database
Table 10 Correlation and SSIM values for the twelve biometric images of the ROI palmprint database
Table 11 FAR, Histogram Deviation, PSNR, and AROC of the proposed cancelable system for all the datasets

According to Table 11, the PSNR values between the original image and the ciphered image for all datasets range from 7.5473 to 9.2309, indicating the proposed algorithm's effectiveness. Tables 12, 13 and 14 show the influence of various Gaussian noise variances on the evaluated biometrics for the suggested cancelable approach, with average FAR, PSNR, \(D_{H}\), and AROC values. The measured FAR, \(D_{H}\), PSNR, and AROC values demonstrate that the proposed architecture has low noise sensitivity.

Table 12 All the metrics (FAR, \(D_{H}\), PSNR and AROC) of the LFW deep funneled dataset under different noise levels
Table 13 All the metrics (FAR, \(D_{H}\), PSNR and AROC) of the UFI dataset under different noise levels
Table 14 All the metrics (FAR, \(D_{H}\), PSNR and AROC) of the ROI palmprint dataset under different noise levels

In Table 15, further trials are conducted to compare the performance of the recommended authentication approach with traditional authentication approaches (El-Shafai et al. 2022d, 2021c; Nagar et al. 2010; El-Shafai and Hemdan 2021; Badr et al. 2021; Faragallah et al. 2021b), in order to further evaluate the overall effectiveness of the discussed cryptosystem for creating an efficient cancelable biometric authentication mechanism.

Table 15 Comparison between the proposed cryptosystem and the traditional authentication approaches

6 Conclusion and future work

This research presented an enhanced encryption technique for generating and constructing a more secure cancelable biometric authorization system. The key contribution of this proposal is the combination of a 3D chaotic map, the Piecewise Linear Chaotic Map (PWLCM), and the Logistic Map with DNA sequence theory to provide a strong cancelable biometric identification system. The technique is built on a confusion and diffusion approach. First, the 3D chaotic map stage consists of the following steps: (1) 3D chaos creation, (2) chaos histogram equalization, (3) row rotation, (4) column rotation, and (5) an XOR process. Secondly, the PWLCM and Logistic Map are used to generate all of the key values required by DNA theory. Thirdly, DNA theory is applied as a second encryption algorithm to the output of the 3D chaotic algorithm. Thus, the proposed framework hides all discriminative properties of the biometric templates, resulting in a fully unrecognizable biometric pattern. The experiments show that numerous biometric datasets may be encrypted using the specified cryptosystem, and they demonstrate its ability to protect biometric patterns compared with conventional approaches. The proposed cancelable biometric approach exhibits an average FAR of \(6.2 \times 10^{ - 3}\), an average \(D_{H}\) of 0.8755, an average AROC of approximately 1, and an average PSNR of 8.2061. For future work, we recommend developing multi-biometric templates based on hybrid encryption techniques to improve efficiency and secure biometric pattern storage against threats.