1 Introduction

We have been involved in many problems of structural analysis in a variety of biological signals, some of which are presented in this paper. Although the problems are diverse, their solutions share a large common set of methods and algorithms for the mathematical characterization and definition of invariants of biological structure. These invariants proved substantial both in the conceptual and methodological evolution towards a deeper understanding of biological reality and in meeting increased operational needs.

Some of the problems were easily modeled, with direct visual monitoring of the related semantics: the structures in focus are geometric in nature, and the methods have a WYSIWYG (what you see is what you get) character, with the semantics directly visualized and tested in real time. Beside this corpus of directly visualized structure, there are numerous problems whose complexity masks some of the important signal components, placing them beyond visual inspection and direct geometric modeling. Many of these problems are well suited to Fourier analysis, where WYSIWYG behavior becomes an essential requirement. Indeed, this method is so remarkable, with hardly any rival on the horizon, that whenever a signal contains a component showing some organization in time, Fourier analysis is the major tool that might uncover how that component of the process functions. Amplitude and frequency modulations, as the simplest means of information transmission and process control, are broadly present in biological systems.

Implementations of Fourier spectroscopy are all based on the theory of Fourier expansions, either over the classic Fourier basis or over any other suitable orthonormal basis. Orthonormality is rather important: expansions over bases lacking this property are filled with extra harmonics of nonintuitive semantics, and the proportions of the amplitudes are not preserved. This is why the property is necessary, even though researchers sometimes work with systems of functions that are not even linearly independent, e.g. Blinowska in [1]. On the other hand, orthonormality alone is not sufficient, and we face a kind of game encountered earlier in quantum mechanics: the interference of the method with the data is real. What we do not have in mathematics, we have here: the semantics relies and depends on the interplay of experimental parameters, sampling rates, spectral resolution, the set of time bases of emerging pulses, the stability of their amplitudes and frequencies, and the relative speed of their changes. In signals of critical complexity, the coupled semantics becomes unstable and tends to fall apart. Here mathematics loses its usual analytic integral formalism, and we have to work in an intuition-guided, experimental mathematical environment, taking care of unstable semantic components: a large and short pulse may easily appear smaller than a small and long one, to give the simplest example of semantic instability. Small changes of the involved parameters can cause semantic singularities, and this has to be controlled.

Hence, important issues in signal spectroscopy, such as spectral recalibration, time Fourier spectroscopy, spectral feature types, and the enhancement of negligible but important structures and their morphological aspects, are treated here too. The small and elusive might be of crucial importance.

A spectral line in a signal x corresponds to a coefficient in the Fourier expansion of the element x in a Hilbert space X,

$$ x = \sum\limits_{i \in I} \hat{x}_i \, x_i $$

where the coefficients \( \hat{x}_i = (x, x_i) \), \( i \in I \), are the projections onto the orthonormal basis vectors \( x_i \) in the basis set B, obtained as inner products; they are always scalars, hence constants. We are mainly dealing with power spectra.
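As a minimal numerical sketch (an assumption for illustration, not our implementation), the discrete analogue of this expansion can be computed with the FFT, where the coefficients \( \hat{x}_i \) are the inner products with the discrete Fourier basis and the power spectrum consists of their squared moduli:

```python
import numpy as np

def power_spectrum(x, fs):
    """Discrete Fourier coefficients x_hat_i = (x, x_i) and their squared moduli."""
    n = len(x)
    x_hat = np.fft.rfft(x) / n                # projections onto the Fourier basis vectors
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)    # frequency of each basis vector
    return freqs, np.abs(x_hat) ** 2          # power spectrum

# Example: a 10 Hz sine sampled at 1 kHz produces a single prominent spectral line.
fs = 1000.0
t = np.arange(0.0, 1.0, 1.0 / fs)
freqs, power = power_spectrum(np.sin(2.0 * np.pi * 10.0 * t), fs)
print(freqs[np.argmax(power)])                # ~10.0 Hz
```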

One important aim is always to automate all procedures that involve work with features, and this is essentially achieved in all our implementations. The basis for it is described in the second section. The methods presented here have diverse applications in experimentation with biological signals. In the third section, selected examples obtained with our systems in the laboratories we cooperate with are used to illustrate the methods. Our implementations and the data from the presented experiments are available at our web site www.matf.bg.ac.yu/~aljosha

2 Method

We investigate signals with noticeable geometric or random features, as well as signals containing informational structures of a discrete sort. Not seldom, we deal with signals resulting from interferences of various kinds, often wishing to uncover certain individual sources. In such situations the recorded signals contain less desirable components, usually referred to as contamination or noise.

There are elementary cases where the focus of investigation is on geometric features directly observable in the signal shape; some examples are available in [2–5]. There, varieties of classes of shapes, together with signal component extraction and classification, are of major importance. Features of this kind dominate the analysis of ECG, epileptic seizures, the firing of individual neurons, the photometry of microscopic images and some other simple signals. As a representative example, we shortly discuss the simple shapes present in photometric signals of microscopic images of chromosomes and the requirements of their analysis; other examples are shown in the next section. With a CCD we work with two-dimensional photometry. Taking longitudinal sections of chromosomes, we obtain rather simple signals of polynomial sort which characterize individual chromosomes. Geometric properties of these photometric signals are used to determine mathematical invariants of the investigated material, the chromosomes, which provide for global chromosomal comparison and similarity. For similarity estimation we can take, in the polynomial photometric curves p1 and p2, the sequences of distances between the first and the other local maxima, and then compare these sequences or determine the norm of their difference. Another metric on the space P of photometric chromosomal representations is defined as

$$ d\left( p_1, p_2 \right) = \min\left\{ \int \left| p_1(x) - a\,p_2(x) + b \right| dx : a > 0,\ b \in \mathbb{R} \right\}, $$

or, simpler, substituting the photometry polynomials with characteristic functions Cp_i taking, e.g., the value +1 on the convex and −1 on the concave subdomains; then, using just translation and stretching of either function, we get a rather robust metric

$$ d'\left( Cp_1, Cp_2 \right) = \min\left\{ \int \left| Cp_1(x) - Cp_2(ax) + b \right| dx : a > 0,\ b \in \mathbb{R} \right\}. $$

In this way we get precise measurements of chromosomal similarity, resulting in chromosome classification or, with some refinement, in characterizations of micro details of individual objects. The parameters a and b can be further subjected to some limits. In this way we perform fuzzifications of the introduced metrics, with different levels of fuzzification determining degrees of metric refinement or space granulation, as needed for various approaches to similarity: finer for microstructure examination, rougher for global chromosomal comparison and classification. Often, chromosomes are more or less bent. Obviously, chromosomal curvature can significantly distort the metric results. Thus, instead of flat longitudinal photometric sections, it is more appropriate to take photometric sections along longitudinal geodesic lines. That means we need to trace curved coordinate systems along individual chromosomes, following their shape curvature. The equivalent would be to take flat longitudinal sections along the normalized, rectified chromosome image. We developed rectification procedures which include manual, semiautomatic and automatic tracing of coordinate systems on chromosomes, with parametric control of the degree of normalization, since in some cases complete normalization introduces an undesirable degree of local image distortion. Before such procedures are applied, it is necessary to perform automatic contour/shape extraction and reduction of background noise.
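To make the definition concrete, the following hedged sketch approximates the metric d′ over the characteristic functions by a plain grid search over the stretch a and the shift b; the discretization, the grid ranges and the curvature-based construction of Cp are assumptions made for illustration only, not our production implementation.

```python
import numpy as np

def characteristic(p, x):
    """Cp: +1 on convex, -1 on concave sub-domains of a sampled photometric curve p(x)."""
    curvature = np.gradient(np.gradient(p, x), x)   # discrete second derivative
    return np.where(curvature >= 0.0, 1.0, -1.0)

def d_prime(p1, p2, x,
            a_grid=np.linspace(0.5, 2.0, 31),       # assumed range for the stretch a > 0
            b_grid=np.linspace(-2.0, 2.0, 41)):     # assumed range for the shift b
    """Approximate d'(Cp1, Cp2) = min_{a,b} integral |Cp1(x) - Cp2(a x) + b| dx."""
    c1, c2 = characteristic(p1, x), characteristic(p2, x)
    dx = x[1] - x[0]                                # uniform sampling assumed
    best = np.inf
    for a in a_grid:
        c2_stretched = np.interp(a * x, x, c2)      # sample Cp2 at the stretched coordinates
        for b in b_grid:
            value = np.sum(np.abs(c1 - c2_stretched + b)) * dx
            best = min(best, value)
    return best
```

A finer grid over a and b, or restricting their ranges as discussed above, corresponds to the different levels of fuzzification of the metric.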

In situ hybridization techniques magnify features at the molecular level to the visually observable level. Using the described method enables positional localization of genetic signals with sub-promille precision, relative to chromosome length, in a chromosomal coordinate system. The emerging multiple-gene hybridization can be treated similarly to the first chromosome metric, yielding combinatorial invariants of gene distributions on the chromosome, which could have a serious impact on future diagnostic ramifications. In cases of large numbers of hybridization-enhanced molecular features, counting measures, distributions, clustering and statistical properties replace geometric shape analysis. We developed fractal analysis procedures which can give good dimensional invariants.
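Our fractal procedures are not detailed here; as a hedged illustration of the idea of a dimensional invariant, the sketch below estimates the standard box-counting dimension of a binary mask of hybridization-enhanced features, with the box sizes chosen only as an example.

```python
import numpy as np

def box_counting_dimension(mask, box_sizes=(2, 4, 8, 16, 32)):
    """Estimate the box-counting dimension of a non-empty 2-D boolean mask."""
    counts = []
    for s in box_sizes:
        h, w = mask.shape
        trimmed = mask[:h - h % s, :w - w % s]      # trim so the image tiles exactly into s x s boxes
        boxes = trimmed.reshape(trimmed.shape[0] // s, s, trimmed.shape[1] // s, s)
        counts.append(np.count_nonzero(boxes.any(axis=(1, 3))))   # number of occupied boxes N(s)
    # the dimension is the negative slope of log N(s) against log s
    slope, _ = np.polyfit(np.log(box_sizes), np.log(counts), 1)
    return -slope
```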

Apart from shape analysis, the investigation of frequency distributions in biological signals can uncover more subtle phenomena, structures and relations which are hardly visible or completely invisible, being masked by complexity or, even worse, by the noise level. Fourier spectroscopy and its variants (some examples are available in [6]) are the standard methods for frequency distribution analysis, with a long tradition and broad applicability. Fourier spectra usually contain some degree of randomness, often caused by interference. The importance of the method comes from the presence of prominent spectral lines which correspond directly to certain physiology; in simpler cases they rise above the noise level, and they and their relative relations are the subject of investigation. There are simple or stationary processes, or processes so conditioned, where the investigated spectral lines are stable over time [7–9], which is crucial because this stability characterizes the related phenomena. However, in order to establish this property, a sequence of spectra or power spectra is needed. Spectral stability over time, with fixed spectral resolution and sufficient time resolution, is the basic criterion for the significance of spectral semantics. The relation of feature time locations and spectral resolution, with the corresponding degree of uncertainty, provides the key parameters. If we have a sequence of adjacent spectra whose dominant lines change quickly, then the individual spectra can hardly have a direct, reasonable meaning. The fundamental WYSIWYG (what you see is what you get) property of Fourier spectroscopy therefore has to be carefully ascertained. People often expect and believe that the proportions of spectral lines reflect the ratios of the amplitudes of the related frequency components present in the investigated signal. Not seldom, this is not the case at all. Consequently, the meaning of individual spectra presented in many scientific papers is questionable. Time Fourier spectroscopy, the spectroscopy with a time-scrolling epoch, models a variety of non-static problems well. However, it can depart from WYSIWYG in different ways. As long as the proportions of the amplitudes of the frequency components present in the signal are preserved, the semantics is satisfactory, with good insight into the relative relations, which is enough in most cases. In order to ensure this, we need to recalibrate the observed amplitudes. For a pulse with stable frequency and amplitude, the amplitude of its representation in the Fourier spectrum (e.g. the power spectrum) will be proportional to its time base (i.e. time domain) within the spectral epoch: when the pulse is outside the epoch, its spectral amplitude is zero; it increases to a maximum while the pulse is scrolling into the epoch, remains constant as long as the pulse is within the epoch, and decreases and vanishes as the pulse leaks out of the epoch. In this way, the time profile of its spectral line takes the shape of a Ziggurat; simple geometry then reveals the exact pulse location in time, its amplitude and its frequency, thus compressing the uncertainty. When the prominent spectral lines are sufficiently distant to avoid numeric interference, this provides a good spectral recalibration aiming at WYSIWYG. The method can also be applied for approximate WYSIWYG recalibration of spectra with relatively slow frequency/amplitude dynamics. Tuning the spectroscopic resolution and the sampling rate can help to some extent, but in general Fourier spectroscopy, with whatever set of basis functions, uses constant coefficients.
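The scrolling-epoch behavior described above can be illustrated with a short sketch; the sampling rate, epoch length and pulse parameters below are assumptions chosen for demonstration, not values from our experiments.

```python
import numpy as np

fs, epoch_len, step = 256.0, 256, 16                 # assumed sampling rate, epoch length, scroll step
t = np.arange(0.0, 20.0, 1.0 / fs)
signal = np.zeros_like(t)
pulse = (t > 8.0) & (t < 12.0)                       # a 4 s pulse of stable 10 Hz frequency
signal[pulse] = np.sin(2.0 * np.pi * 10.0 * t[pulse])

freqs = np.fft.rfftfreq(epoch_len, d=1.0 / fs)
line = np.argmin(np.abs(freqs - 10.0))               # index of the 10 Hz spectral line
profile = []
for start in range(0, len(signal) - epoch_len, step):
    epoch = signal[start:start + epoch_len]
    profile.append(np.abs(np.fft.rfft(epoch))[line] ** 2)

# 'profile' rises while the pulse scrolls into the epoch, stays roughly flat while the
# pulse is fully inside, and falls as it leaks out -- the Ziggurat shape whose flat top
# and slopes give the recalibrated amplitude and the pulse location in time.
```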

In time spectra, where the coefficients of the Fourier expansion become functions of time, we can have more or less organized structure. Fast changes with some degree of randomness can be modeled similarly to the CCD photometry cases discussed previously. There are many cases where the major features of time spectra are geometric time/frequency/intensity structures. They reveal important control mechanisms and other subtle details. If these features are well distinguishable against the time spectrum background, we can apply algorithms for feature contour definition, separation and extraction, with suitable normalizations, similarity pattern matching, classification or recognition, and analysis of the geometric properties of features, as discussed in the photometry examples, using automatic procedures whenever applicable, and semi-automatic or manual ones when fuzziness or uncertainty becomes significant. Among the most exciting examples are the patterns currently used in BCI applications; the initial complete solutions are available in [10, 11]. Mostly, these are patterns of brain signals related to imagined motor activity, and they are suitable for automatic treatment. Such applications are successfully developed in various brain research facilities, with a growing collection of patterns of this type. The high-frequency bases for BCI are presented in [2, 12–15].

There are more complex cases, where the target patterns are partly or completely masked and invisible, not just in the original signals but in the time spectra as well. For example, the brain patterns related to music tones and music of external origin fall into this category (early works in [2, 12, 16, 17], more recent in [13, 18, 19], an exciting success in [20]). We are involved in the recognition of inner music, i.e. imagined tones and music, and in BCI based on inner tones and music, where the masking effects are so pronounced that the traces of inner tones are often totally imperceptible in the power spectra. Hence, we have a problem: how to detect structures in the power spectra which are embedded in the noise? These structures can be so elusive that they avoid detection by statistical causality tests. There are some rather simple techniques that can be useful in such situations: local linear dependence and spectral feature combing. For two signals s1 and s2, with power spectra p(s1) and p(s2), we say that they are linearly dependent in the coordinates i1, i2,…,in if their coordinate-wise (dot) product is non-zero in each of these coordinates. Again, the relative ratio is what counts. When we have a number of brain EEG signals e1, e2,…, en, all of which contain a tiny frequency component, then, since the majority of brain activity is rather local, the dot product p(e1)*p(e2)*…*p(en) will relatively enhance the linearly dependent coordinates and let the tiny component bearing a music tone or a pattern surface out of the noise, while still avoiding other energetic lines, which mainly belong to the lower frequency band. Obviously, the same holds for time power spectra. The other method, spectral combing, extends the above by extending the product Π p(ei) with one more factor: the power spectrum of tj ∈ T, the set of calibration tones, constant in time, acting as spectral filters corresponding to the tones in the calibration set T. Picking

$$ \max\left\{ \left( \Pi\, p(e_i) \right) * p(t_j) \,\middle|\, i \le n \ \&\ t_j \in T \right\}, $$

we have a good chance of extracting the present tone out of the noise. However, the individual inner tones are imagined, generated by one individual, and, as in loud singing, we face a situation with imprecisely tuned tones which have to be extracted and identified. Thus, instead of individual precise tones (with their harmonics), it is more successful to use their controlled domain fuzzification, i.e. each frequency line in the tonal power spectrum p(tj) is substituted by a narrow spectral band of controlled width, with a desirable distribution, the simplest being normalized with value 1. By a desirable distribution in this context we mean certain types of fuzzy sets that can enhance the extraction of the required features. So Fε(p(tj)) would be an ε-fuzzy neighborhood (fuzzy representation) of the individual tone tj ∈ T, leading to the fuzzification of the extraction procedure, thus calculating

$$ \max\left\{ \left( \Pi\, p(e_i) \right) * F_\varepsilon\bigl( p(t_j) \bigr) \,\middle|\, i \le n \ \&\ t_j \in T \right\}, $$

rather than the former product.
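A hedged sketch of local linear dependence and fuzzy spectral combing follows; the channel signals e_i, the calibration tone set T, the fuzzification width ε and the rectangular fuzzy band are all assumptions for illustration, not our laboratory implementation.

```python
import numpy as np

def power_spectrum(e, fs):
    freqs = np.fft.rfftfreq(len(e), d=1.0 / fs)
    return freqs, np.abs(np.fft.rfft(e) / len(e)) ** 2

def fuzzy_comb(freqs, tone, eps):
    """F_eps(p(t_j)): the simplest fuzzy band, value 1 within eps of the tone, 0 elsewhere."""
    return (np.abs(freqs - tone) <= eps).astype(float)

def best_tone(channels, fs, tones, eps=1.0):
    """Pick t_j maximising max over frequency of (prod_i p(e_i)) * F_eps(p(t_j))."""
    product = None
    for e in channels:
        freqs, p = power_spectrum(e, fs)
        product = p if product is None else product * p   # coordinate-wise product of spectra
    scores = [np.max(product * fuzzy_comb(freqs, tj, eps)) for tj in tones]
    return tones[int(np.argmax(scores))]
```

Replacing the rectangular band in fuzzy_comb by another fuzzy membership function, or extending it with the harmonics of t_j, yields the other fuzzifications mentioned above.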

3 Method Illustrating Examples

In this section we use examples to illustrate the methods discussed earlier. We start with shape analysis. In Fig. 1 we have recordings of signals from individual Purkinje neurons in the brains of two rats, acquired at a high sampling rate of 30 kHz. The top animal was given a pharmacologic substance, while the lower one received nothing. The presented images show different aspects of the firing patterns under different magnifications. The lower animal exhibits well-organized packs with equidistant voids, an organization which is highly disturbed in the animal that received the drug: the packs are irregular and irregularly spaced, and the voids contain many more individual firings. The highest magnification provides fine details of the single simple and complex pulses.

Figure 1

Signals from individual Purkinje neurons in two rat brains, acquired at 30 kHz. The top signal shows disturbances due to the administered drug, used in the treatment of heart problems.

In Fig. 2, lower, with flat chromosomes, on the left we have a CCD image of a mitosis, with a pair of homologous chromosomes in the lower right corner shown with their photometric longitudinal sections; the right one has one extra local extreme, corresponding to one extra absorption zone, exhibiting a chromosomal trisomy syndrome. In the middle and on the right, a pair of mouse chromosomes is shown with their different longitudinal photometric sections and similarity matching, using a metric similar to the one defined earlier.

Figure 2

The extracted chromosome photometry demonstrates a trisomy.

Automatic chromosome contour definition, tracing of the curved coordinate system, its normalization-rectification and sorting are shown in Fig. 3.

Figure 3

Automatic karyotype, with algorithms for contour definition, object extraction, tracing of the curved coordinate system, image transformation using coordinate system normalization, and sorting.

A mitosis with one larger irregular chromosome next to the largest one, chromosome 1, is shown in Fig. 4, left. In the middle we have these chromosomes normalized, with their photometric section shapes, for inspection of the partial structural similarity of the irregular chromosome, whose genetic origin is investigated using accented features in the two-dimensional photometric representations. On the right is a geometric equation: the top, normalized irregular chromosome is structurally identical to the structural fusion of the lower manifolds, on the left the missing Y chromosome, here borrowed from another mitosis, and a large part of chromosome 1. The shapes are similar according to our definitions and equal in length, thus geometrically congruent.

Figure 4

Geometric morphology equations on previously normalized chromosomes, used in genetic chromosome backtracking.

Figure 5 depicts a FISH (fluorescence in situ hybridization) mitosis, with bright spots marking a gene signal. The normalized chromosomes with gene signals, together with their photometry and similarity comparison, show that the gene positions in both chromosomes are practically identical, allowing precise addressing of the gene location in the chromosome coordinate system.

Figure 5

Precise addressing of a gene in a chromosomal coordinate system, after chromosome normalization.

A mitosis with FISH gene signals, with multiple signals on the extracted and normalized chromosomes, is shown in Fig. 6.

Figure 6

FISH signals on chromosomes provide for gene identification and localization in the karyotype. The presence of multiple signals opens the possibility of identifying morphometric invariants, which could be important in future differential diagnostics.

Numerous hybridized RNA molecules in a neuronal cell nucleus are measured by counting local extremes of photometric absorption. Different granularity refinements result in different counts of local extremes corresponding to the dark spots, shown on the left in Fig. 7. The microscopic CCD image of cell nuclei in the center does not show details. However, if the image is inverted (negative), we notice gene signals. In the two-dimensional photometry we can identify the nuclei well, with the largest, central nucleus containing three gene signals, identifying a trisomy syndrome.

Figure 7

On the left, a hybridized neuronal nucleus with dark-dot counts obtained using different filters for the separation of local extremes. Center: a CCD image of a cluster of cell nuclei not showing hybridized genes. Right: simple operations on digital images provide visualization of originally invisible structures.

Samples of FISH preparations taken with different wavelength filters under ultraviolet light are shown in Fig. 8, left and center, exhibiting very poor contours. After recentering and some preprocessing, image fusions in RGB or in monochrome are made as linear combinations of the starting images, shown on the right. In this way we obtain transcolor predicates or properties, as a color gallery, thus in one dimension higher. This uncovers relations that might have been hidden in the original shots (with fixed color balance), such as signal-to-contour relations. If necessary, prior to fusion we can perform orthogonalization of the input images.
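A minimal sketch of this fusion step, assuming co-registered grayscale arrays as input, might look as follows; the weights are placeholders to be tuned per preparation.

```python
import numpy as np

def fuse(images, weights):
    """Monochrome fusion: the linear combination sum_k w_k * I_k of co-registered images."""
    fused = sum(w * img.astype(float) for w, img in zip(weights, images))
    fused -= fused.min()                          # renormalize to [0, 1] for display
    return fused / max(fused.max(), 1e-12)

# A transcolor (RGB) view from three filtered shots i1, i2, i3 (hypothetical arrays):
# rgb = np.dstack([fuse([i1, i2, i3], w)
#                  for w in ([1.0, -0.5, 0.0], [0.0, 1.0, -0.5], [-0.5, 0.0, 1.0])])
```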

Figure 8

Left and center: two of three poor-detail CCD shots of preparations with hybridized cell nuclei. Right: linear combinations reveal originally invisible structure.

Details and properties of signals which are hard or impossible to perceive visually are analyzed with Fourier spectroscopy. Spectra of acoustic examples are shown in Fig. 9. They are basically WYSIWYG because the sampling-to-resolution ratio is well adjusted and the amplitudes and frequencies of the spectral features are constant enough.

Figure 9

Acoustic spectrograms. Left: a whale song; center: Bach, the signature theme; right: the beginning of Zarathustra. All are basically WYSIWYG.

In Fig. 10 we have prominent spectral lines of brain signals under acoustic stimulation, with their time profiles concatenated in the top right window of the left image: the Ziggurat-like shapes, which are used for the subsequent spectral recalibration to WYSIWYG. The screenshot on the right shows an acoustic signal with diminishing intensity, its time spectrum and the profiles of the spectral lines, top right, exhibiting distorted Ziggurat shapes.

Figure 10

Brain signal spectroscopy. Left: the Ziggurat-like profiles of spectral lines. Right: the time profile of a diminishing tone's spectral line, with a still rather visible Ziggurat appearance.

Some examples of rat arterial blood pressure (BP) time spectra, whose variability is controlled by the CNS, the central nervous system, with different spectral features corresponding to the rats' exposure to diverse drugs, are shown in Fig. 11, top left. In the top right we have the BP signals of two rats with their spectra. Both were exposed to heavy hemorrhage; the first received the survival-supportive physostigmine, the second was given nothing. In the second row we have accumulated time power spectra with a two-hour time base, depicting the whole experiment. The left image shows a dominant structure, which is present in the regular BP spectra of the first animal, with some periodic variability corresponding to a slow negative feedback control. In the right image the power spectra of both animals are shown: the left time power spectrum belongs to the animal which received physostigmine, the other is on the right. The first animal, whose spectrogram is on the left, survived the experiment; its spectrogram is basically regular. The other animal died at the end of the experiment. Its spectrogram is essentially different, having a low-frequency feature activated circa 40 min after the beginning and vanishing some 20 min before the end of the experiment and of its life. The mid-frequency major feature which is obvious in the regular spectrograms, the baroreflex line, is completely missing. In the final 20 min of this experiment, a tilted feature emerges from a random dot cloud, close to the position of the central dominant feature in the regular spectrogram. The switching of spectral features witnesses the switching of two different processes in the CNS control. Here the tools devised for the photometry investigations shown earlier are used to extract features from spectra, to normalize them, to analyze structural details and to perform similarity comparisons. In this way different controls, functional modules in the CNS which are active in BP regulation, are identified, giving rise to the possibility of investigating, in such circumstances, the functions involved in death and its announcement.

Figure 11

BP analysis. Top left: examples of time-spectral structures identified after administration of diverse pharmacologic substances. Top right and lower row: analysis of spectrograms of hemorrhaged rats. The 2 h spectrograms of both animals are at lower right. Lower left: a baroreflex line in the surviving animal, with very slow variability.

Two examples of experiments with EEG signals are presented next. Work on BCI developed through the 1990s and early 2000s, reaching full affirmation and explosive implementation in recent years. One general approach, which has had broad success and is dominant in current BCI, is based on the identification and real-time recognition and classification of will-generated features in low-frequency EEG, mainly corresponding to imagined motor actions, i.e. the activation commands for those actions. In Fig. 12 we have unfiltered signals, with good response, from four experiments of will-controlled physiology in the low-frequency EEG bands. The signals from each experiment are placed in one column of the initial signal matrix. In the first row of the second image we have the signals with their power spectra and the low-frequency extracts of the power spectra. The composite time spectra, obtained as the dot products of the power spectra of each experiment with time propagation, are shown in the second row, left. In the delta and theta region (≤6 Hz) we can see clear frequency/time features, corresponding to two different will-controlled physiological states. The top right corner of this image is expanded in the right image of the second row and shows a will-generated feature in the 10–12 Hz band. Even with a few well-distinguished, controllable will-generated states, one can integrate a successful BCI, as is the case with the major contemporary BCIs.

Figure 12

Experiments with controlled brain physiology. Top: well-responding channels, with the unfiltered signals from the experiments arranged in columns; signals, spectra and initial spectral extracts at top right. Lower row, left: composite time spectra enhancing the spectral features present in all signals and removing those which are not, in delta and theta in the second, third and fourth quadrants; the first quadrant has a structural feature in the 10–12 Hz band. The same structure is shown separately in the right image.

An example of unfiltered signals containing inner tones and music is shown in Fig. 13. Here we have some of a large number of short experiments, lasting 5 s each, in which the experimental subjects imagined calibrated tones that had been played shortly before the registration started. Applying the partial linear dependence method to the time spectra of sequences of signals from well-responding electrodes, the imagined inner tones were unmasked with variable but distinct success. Shown here are some rather well localized traces of the imagined tones. In the top row of Fig. 13 we have the real-time composite spectrograms obtained in a similar way as in the previous example. Some of the dominant features correspond to the imagined tones. In this way the inner tones are well distinguishable in their frequency neighborhood, although the majority of them were embedded in the spectral noise. In the second row, left, we have a fuzzy-combed composite spectrogram for the tone c2 in its 50 Hz neighborhood (with one preceding intervention, local boxing of the power spectrum), reaching the composite amplitude maximum of 1, while on the right we have the initial part of the composite spectrogram with the energy concentration, whose maximum reaches a magnitude of 16. The energy distributed in this part of the composite spectrum is approximately three hundred times larger than at the imagined tone, but the two are well separated in frequency, which makes the imagined tones distinguishable, surfacing them from the surrounding noise. Further development and reliable automatization will support BCIs based on inner music.

Figure 13

Top row: well-distinguished traces of the imagined (inner) tones in their spectral neighborhood, in composite spectrograms of the EEG channels containing the frequency feature corresponding to the imagined tones. Lower left: an inner tone well extracted by combing. Right: the energetic low-frequency band.

4 Conclusions and Discussion

Structural features in diverse biological signals are investigated, with attention focused on the directly perceptible geometric and random features, illustrated with a number of examples involving rather simple mathematical methods and implementations, thus leading to the essential mathematical representations and invariants needed for the precise treatment of similarity, classification and fine structural analysis, with a number of algorithms improving the involved solutions and making them more meaningful. Some of the original processes and phenomena are electrophysiological, some belong to mechanical physiology and some are microscopic preparations; thus, some are transformed into electric form before recording. Basic properties of spectroscopic signal analysis are investigated, together with some method corrections and enhancements needed for a better and more precise insight into the signal components of greatest interest. Time spectroscopy, with some linear algebra useful in the problems encountered in the analysis of spectroscopic time features, is presented with a set of algorithms similar to those which were useful in the geometric and statistical models applied to the former, directly visually observable signal features. The structures present in some important spectroscopic treatments are discussed; especially important are the blood pressure CNS controls and the brain patterns involved in different BCI models. The inner music investigations are approaching the cognitive brain faculties. The basic cognitive content in the case of imagined music is not really far from its spectroscopic representation, and inner music is one of the first cases of cognitive content being nearly directly observed. A piece of music can be viewed as a time sequence of individual harmonies (h1, h2, …, hn), where in many cases the hi, i ≤ n, are well identified by trigonometric polynomials consisting of pulses, the frequency components of each hi. As such, it has an immanent geometric nature: the shapes of individual tones in the recalibrated time spectrum. Its algebraic correlate is a discrete syntactic structure respecting certain algebraic or formal laws, the composition rules. Thus we have two structural representations of music: one spectral or geometric, similar to the note-system representation, and the other algebraic or symbolic, and the two are easily interchangeable. In this way the recognition of spectral tonal and music representations in brain patterns, as present in electrophysiology, leads to the identification of symbolic content, which is processed in parallel by a vector of cognitive centers. Automatization of the described implementation is developed with intelligent control of the system, with fuzzy modules like the one described above, but with fuzzification at the intelligent level too, within syntax processing in the automated reasoning modules. Further developments and method refinements would bring us closer to the investigation of brain patterns related to other cognitive faculties, speech and language processing, and one day mathematics in the brain, where we have to deal with even more elusive, discrete information-bearing structures (DIBS) and with problems similar to those discussed above: structural similarity/metrics, structural pattern matching, classification and recognition [21].

More on the successes in BCI is available in [2, 3, 16, 17]. We are investigating the applicability of traditional linear and nonlinear methods to some current problems such as brain injury [22–25], monitoring the depth of anaesthesia [26] and other applications in biomedicine [27]. In cases of less pronounced features, nonlinear analysis provides additional insight.

Concerning relevant statistical verification of the methods, we can say that the outcomes of a number of the mentioned algorithms depend on the preprocessing stage. In the case of the chromosomal metrics, if the input chromosomes are well prepared, with clear banding patterns, and if they are well dispersed on the preparations, without excessive chromosomal curvature (whose normalization introduces loss of information) and without overlapping chromosomes (where it is hard to distribute pixel intensity among the overlapping chromosomes), the comparison metrics work well, with very high, close to 100% statistical verification. Otherwise, if some of the mentioned properties are severely violated, or if the chromosome banding contrast or image homogeneity is poor, as in cases of severe hematologic or oncologic syndromes, often with multiple malformed chromosomes that do not exist in normal mitosis, the metrics remain stable but the comparison standards are nonexistent or loose, and the outcome easily becomes unpredictable. Therefore, the application needs supervision. Similarly, with spectral features, if they are stable and well distinguished from their spectral environment, the method is highly reliable and in broad use by large populations of researchers. The estimated relative errors in the measurements of morphological parameters of objects detected in processed images, and the relative errors of the values of parameters obtained from the FFT, are less than 1.5% in the most standardized measurements. The estimated relative errors in determining the similarity of detected objects from images or processed signals are less than 2%. The obtained significance levels were less than 0.03 in the evaluation of the hypotheses of reproducibility of the experiments and in the evaluation of the hypotheses determining causalities in the analyses shown in Figs. 10, 11, 12, and 13. Consequently, standardized experiments like the pharmacologic examples shown are treated with very high stability, while with the inner tones we are happy when we detect their traces. The recent success of Grierson with inner tones is reported to reach 80% correct guessing.