Key Facts of Urinalysis

  • Among other characteristics, an ideal biomarker should be easily accessible and noninvasive, criteria that urinalysis can fulfill.

  • Although the 24-h urine evaluation is the gold standard for most quantitative parameters, random spot urinalysis may give results that correlate with this technique over a wide range of measurements.

  • Repeated measurements on random spot samples can be easily obtained to ascertain disease activity and response to treatment. However, the results can be seriously affected by several variables that depend on the patient and on the technique itself, so they must be regularly confirmed.

  • Several formulae and methods have been successfully developed to improve the accuracy of measurements in a random spot urine sample.

  • Current guidelines already recommend the protein/creatinine or albumin/creatinine ratio in a random spot sample as a first option to monitor diseases like chronic kidney disease and lupus nephritis.

  • Despite these recommendations, evidence on reliability is not unanimous for all diseases (e.g., lupus nephritis), so these ratios should be applied carefully, especially if therapeutic changes are about to be made.

Definitions

Biomarker

The Oxford dictionary describes a biomarker as “a measurable substance in an organism whose presence is indicative of some phenomenon such as disease, infection, or environmental exposure.”

Chronic kidney disease

Kidney damage or an estimated glomerular filtration rate below 60 mL/min/1.73 m2 persisting for 3 months or more, irrespective of the cause.

Diabetes

Group of metabolic diseases characterized by hyperglycemia resulting from defects in insulin secretion, insulin action, or both. Chronic hyperglycemia is associated with long-term damage, dysfunction, and failure of various organs, especially the eyes, kidneys, nerves, heart, and blood vessels.

Dipstick

A thin, plastic stick with strips of chemicals on it, which is placed in urine to detect abnormalities. The chemical strips change color if certain substances are present or if their levels are above normal.

Glomerulonephritis

Disease that primarily affects the glomerulus. It has clinical presentations that vary from the asymptomatic individual who is found to have hypertension, edema, hematuria, or proteinuria at a routine medical checkup to a patient who has fulminant disease, with acute kidney injury, possibly associated with life-threatening extrarenal disease.

Lupus nephritis

Immune complex glomerulonephritis that is a common and serious feature of systemic lupus erythematosus.

Mercaptoethane sulfonate sodium (MESNA)

A sulfhydryl compound that is used to reduce the incidence of hemorrhagic cystitis associated with certain chemotherapeutic agents, like cyclophosphamide.

Random spot urine

An occasional, one-off urine collection.

Systemic lupus erythematosus

Multisystem, autoimmune disease, whose diagnosis demands a combination of clinical and laboratory criteria defined by the American College of Rheumatology.

Sodium/glucose cotransporter 2 (SGLT2)

Protein encoded by the SLC5A2 gene and member of the sodium glucose cotransporter family involved in glucose reabsorption in the kidney.

Solute carrier family 5 member 2 (SLC5A2)

Gene that encodes the SGLT2 protein.

Urocrit

By analogy with the term hematocrit, it has been used to define the proportion of the urine that is composed of red blood cells. It is measured by a centrifugation technique and is used when erythrocyturia is too high to count by microscopy.

Introduction

Urinalysis has historically been defined as a test to evaluate kidney and urinary tract disease. However, during the last few decades, several urinary biomarkers have been correlated with abnormalities of organs and systems other than the kidney.

It is well known that an ideal biomarker should be easily accessible and noninvasive, among other characteristics. This profile matches a random spot urine marker well: patients with nephropathy often progress to end-stage renal disease (ESRD) and depend on hemodialysis treatment, which requires a vascular access (ideally an arteriovenous fistula or graft) whose correct function depends on good vessel conditions, so venous punctures to collect blood samples are harmful and should be avoided. In this setting, abnormal findings on a urinalysis may be the first evidence of underlying kidney disease, and urinalysis may be a suitable test for monitoring several conditions, thereby minimizing the number of blood tests.

Twenty-four-hour urine samples used to be the only reliable samples for most quantitative analyses. Nevertheless, collecting them is laborious and unfeasible for many population groups, such as young working adults with demanding schedules and multiple workplaces, elderly people with memory problems or urinary incontinence, and children. In these cases, random spot urine assessment has been successfully used. Although this method may be limited for some quantitative parameters and more vulnerable to factors related to the patient’s lifestyle, correcting formulae have been developed to achieve quantitative results that correlate highly with timed specimens. This supports current guidelines that already recommend random spot urinalysis instead of the 24-h specimen for screening, diagnosis, and monitoring of several diseases like diabetes or systemic lupus erythematosus (SLE) (Engelgau et al. 2000; Hahn et al. 2012).

Random spot urine markers can be organized into three groups, according to their laboratory method of assessment: physical, chemical, and microscopic markers. Physical markers include color, odor, clarity, and specific gravity (SG). Chemical examination includes the identification of protein, blood, glucose, pH, bilirubin, urobilinogen, ketones, nitrites, and leukocyte esterase (LE). Finally, microscopic evaluation entails the detection of crystals, cells, casts, and organisms. Chemical results are the most widely used, probably because of their (semi)quantitative presentation and possible correlation with quantitative timed collections, but physical and microscopic findings may play a key role in the diagnosis and monitoring of many conditions.

This review outlines random spot urine biomarkers for the kidney and their applications in both research and clinical practice.

The Specimen for Analysis

Urine collection requires standard handling guidelines to minimize unwanted sources of variability that could compromise the accuracy of test results. Good practice prevents potential sources of error; thus, patient preparation and clear instructions are essential for a correct collection (Strasinger and Di Lorenzo 2014). Both the suitability of a specimen and the rejection criteria must be determined. Following collection, specimens should be delivered and tested within 2 h because changes in urine composition start immediately after voiding (Strasinger and Di Lorenzo 2014). Bacteria may proliferate and alter pH, casts may dissolve, and crystals may be lost. Refrigeration may cause precipitation of orange-red crystals of uric acid, which can be redissolved by rewarming the urine (Strasinger and Di Lorenzo 2014). Additionally, urinalysis can be seriously affected by other variables that depend on the patient, the technique, and the medical team.

Although this type of specimen is the easiest and most convenient for the patient because it can be collected at any time, some basic recommendations should be followed. Vigorous physical exercise should be avoided for at least 24 h before the collection to minimize exercise-induced proteinuria, hematuria, or cylindruria (Johnson et al. 2014). Women should avoid collection during menstruation because of blood contamination. A midstream clean-catch specimen provides a suitable specimen for routine urinalysis and bacterial culture because it is less contaminated by epithelial cells and bacteria and, therefore, is more representative of the actual urine (Strasinger and Di Lorenzo 2014).

Regarding collection schedule, the first morning void is ideal for preventing false-negative pregnancy tests and for evaluating orthostatic proteinuria, because it is a more concentrated specimen, thereby ensuring the detection of lower levels of substances that would otherwise be missed in a more diluted specimen (Strasinger and Di Lorenzo 2014). On the other hand, lysis of cells and casts may occur in the bladder overnight, which may lead to false-negative results. In these cases, a second morning urine is more appropriate.

Finally, before reviewing each biomarker analysis and applications, it should be remembered that urine normally contains 95 % water and 5 % solutes. As stated above, considerable variations in these solute concentrations can occur in healthy people owing to the influence of normal biological factors, but they could also be markers of pathogenic processes or pharmacological responses to a therapeutic intervention.

Physical Markers

All routine urinalyses should begin with a physical examination of the sample, which includes description of color, odor, clarity, and SG.

Color

Normal urine ranges from pale to dark yellow, a color mainly due to the urochrome pigment, a product of endogenous metabolism, but also influenced by other chemical concentrations and by pH. Changes in color can be caused by physical activity, food, drugs, or pathological conditions. The most frequent pathological conditions that can cause color changes of the urine are gross hematuria, hemoglobinuria, or myoglobinuria (pink, red, brown, or black); bilirubinuria (dark yellow to brown); and urinary infections (dark yellow, white, green, or even purple, according to the germs involved). Less frequent causes include uric acid crystalluria (pink) and porphyrinuria and alkaptonuria (red, turning black on standing). The main drugs responsible for abnormal urine color are rifampin, phenazopyridine, and phenindione (yellow orange); desferrioxamine (pinkish); phenytoin (red); chloroquine and nitrofurantoin (brown); amitriptyline, triamterene, propofol, and blue dyes of enteral feeds (green); methylene blue (blue); and metronidazole, methyldopa, phenol derivatives, argyrol, and imipenem–cilastatin (darkening on standing). Some examples among foods include beetroot (red), senna and rhubarb (yellow to brown or red), and carotene (brown) (Strasinger and Di Lorenzo 2014; Johnson et al. 2014).

Odor

A change in urine odor may be caused by the ingestion of some foods, urinary tract infection (UTI), maple syrup urine disease, phenylketonuria, isovaleric acidemia, and hypermethioninemia.

Turbidity or Clarity

The transparency or turbidity of a urine specimen is determined by visually examining the mixed specimen in a clear container while holding it in front of a light source. Freshly voided normal urine is usually clear, particularly if it is a midstream clean-catch specimen.

Urine turbidity can have multiple causes. The most frequent non-pathological causes include squamous epithelial cells, mucus, amorphous phosphates, carbonates, urates, semen, fecal contamination, radiographic contrast media, talcum powder, and vaginal creams. On the other hand, the most common pathological causes are red blood cells (RBCs), white blood cells (WBCs), bacteria, non-squamous epithelial cells, yeast, abnormal crystals, lymph fluid, and lipids.

Clear urine is not always normal; thus, chemical analysis will increase the accuracy of physical markers in detecting certain abnormalities (Strasinger and Di Lorenzo 2014).

Relative Density

The evaluation of urine concentration is made by measuring the SG, which refers to the weight of a volume of urine compared with the weight of the same volume of distilled water and depends on the mass and number of the dissolved particles. SG correlates with urine osmolality, rising by approximately 0.001 for every 35–40 mosmol/kg increase in osmolality; thus, a urine osmolality of 280 mosmol/kg (isosmotic to plasma) usually corresponds to a urine SG of 1.008 or 1.009 (Wald 2014). SG can be evaluated by dipstick, which provides a rapid semiquantitative result alongside the other urinary markers on the series of test pads embedded in a reagent strip (Wald 2014). This method can be influenced by urine pH and non-ionized molecules: underestimation occurs at pH above 6.5, whereas overestimation is found with urine protein concentrations above 7.0 g/L (Assadi and Fornell 1986). Additionally, non-ionized molecules, such as glucose and urea, are not detected by the dipstick, so this method may not strictly correlate with the results obtained by refractometry and osmometry (Siegrist et al. 1993). Refractometry measures all solutes, rather than just ionic substances, and is therefore more accurate than the dipstick. Temperature-compensated equipment eliminates the influence of temperature and is therefore recommended for everyday practice (Strasinger and Di Lorenzo 2014). Nevertheless, this method can be affected by protein, glucose, mannitol, dextrans, diuretics, radiographic contrast media, and some antibiotics (Siegrist et al. 1993). Osmolality is measured by an osmometer and depends only on the number of particles present; it is not influenced by urine temperature or molecule size.
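As a rough numerical illustration of the SG–osmolality relationship quoted above (0.001 per 35–40 mosmol/kg), the following sketch converts an osmolality into an approximate SG; the function name and the midpoint slope are illustrative assumptions, not part of any published formula.

```python
def approx_sg_from_osmolality(osmolality_mosm_kg, mosm_per_0_001=37.5):
    """Approximate specific gravity from urine osmolality, assuming SG rises
    by 0.001 for every 35-40 mosmol/kg (midpoint 37.5 used here)."""
    return 1.000 + 0.001 * (osmolality_mosm_kg / mosm_per_0_001)

# A urine isosmotic to plasma (~280 mosmol/kg) gives an SG of roughly 1.007-1.008,
# close to the 1.008-1.009 quoted in the text.
print(round(approx_sg_from_osmolality(280), 3))
```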

Regarding clinical application, SG gives important insight into the patient’s hydration status and the concentrating ability of the kidneys (Simerville et al. 2005). It can be used to crudely estimate how the concentration of other urine constituents may reflect the total excretion of those constituents (Jung 1991), because SG correlates inversely with the 24-h urine volume (McCormack et al. 1991). An important additional function of this parameter is to determine whether specimen concentration is adequate to ensure the accuracy of chemical tests, because values far outside the normal range may induce false results. Most random specimens fall between 1.015 and 1.030. A SG of 1.000–1.003 is consistent with marked urinary dilution, as observed in patients with diabetes insipidus or water intoxication. A SG of about 1.010 defines isosthenuric urine because it is similar to that of plasma, so it is often observed in conditions in which urinary concentration is impaired, such as acute tubular necrosis (ATN) and chronic kidney disease (CKD). A SG above 1.040 almost always indicates the presence of some extrinsic osmotic agent, such as radiocontrast. In addition, self-monitoring of this parameter may be useful for stone-forming patients, who benefit from maintaining dilute urine (Siegrist et al. 1993).

In conclusion, the dipstick is the least accurate method, so it should be replaced by refractometry if a precise measurement is needed or interfering factors are suspected. Osmolality is more reliable than either method, and most clinical decisions should be based on this determination when evaluating pathological urine.

Chemical Markers

Chemical urinary markers are probably the most useful in daily practice, because their value in the diagnosis, monitoring, and prognosis of glomerular diseases has long been proven. Multiple methods, discussed below, are available to measure each type of marker.

Reagent strips currently provide a simple, affordable, and rapid evaluation. As noted above, they consist of absorbent pads impregnated with chemicals and attached to a plastic strip. A color-producing chemical reaction takes place when the absorbent pad comes in contact with urine (Fig. 1). The time for reactions to take place varies between tests and manufacturers and ranges from an immediate reaction for pH to 120 s for LE. The results are interpreted by comparing the color with a chart supplied by the manufacturer. A semiquantitative value of trace, 1+, 2+, 3+, or 4+ can be reported, and a corresponding estimation of the milligrams per deciliter is available for appropriate testing areas (Strasinger and Di Lorenzo 2014). Improved specificity and sensitivity of the most recent dipsticks, as well as the use of automated strip readers, have reduced the need for routine confirmatory testing.

Fig. 1

Dipstick image. The first strip (A) shows a regular pad before being dipped in urine. The second strip (B) has been dipped in a random spot urine sample of a patient with glomerulonephritis, detecting trace hematuria (a), 3+ proteinuria (b), and 2+ leukocyturia (c)

Blood

Blood may be present in the urine either in the form of intact RBCs (hematuria) or as hemoglobin, the product of RBC destruction (hemoglobinuria). Chemical tests for hemoglobin therefore provide the most accurate means of determining the presence of blood.

Hematuria can be measured quantitatively by any of the following: indirect examination by dipstick, determination of RBC number through chamber count, and direct examination of urinary sediment.

The dipstick method is based on the pseudoperoxidase activity of heme, which catalyzes the reaction of peroxide and a chromogen to form a colored product (green to blue). Red blood cells, hemoglobin, and myoglobin will all catalyze this reaction, so a positive test result may indicate hematuria, hemoglobinuria, or myoglobinuria, respectively (Brenner et al. 2011). This last condition may be suspected when a red-brown urine is seen. Dipsticks can detect concentrations as low as five RBCs per microliter. The terms trace, small, moderate, and large or trace, 1+, 2+, and 3+ are used for reporting (Strasinger and Di Lorenzo 2014). A positive test with green spots mostly results from intact erythrocytes. On the other hand, a homogeneous, diffuse green pattern can result from three main situations: marked hematuria with a number of erythrocytes high enough to cover the whole pad surface; lysis of erythrocytes favored by delayed examination, alkaline urine pH, or low SG; and hemoglobinuria secondary to intravascular hemolysis. False-negative results are mainly caused by ascorbic acid (a reducing agent) (Rauta et al. 2002) and high SG. On the other hand, false-positive results can occur due to myoglobinuria and to a high concentration of bacteria with pseudoperoxidase activity (Enterobacteriaceae, staphylococci, streptococci) (Lam 1995). Given the limited specificity of the dipstick method (65–99 % for 2–5 RBCs per high-power microscopic field), an initial positive result should be confirmed by microscopic evaluation (Sutton 1990).

The chamber count method measures the number of RBCs per milliliter and has been suggested to have greater precision and sensitivity than the sediment count (Grossfeld et al. 2001). Although microscopic examination is only a semiquantitative method of determining the degree of hematuria, it is easier to perform, less time-consuming, and more cost-effective than the chamber count. Furthermore, it allows different elements to be distinguished by their appearance, enabling the source of the hematuria and its main etiologies to be identified (Johnson et al. 2014). However, chamber and sediment counts have been shown to correlate with acceptable sensitivity (Grossfeld et al. 2001).

Regarding clinical significance, hematuria is most closely related to disorders of renal or genitourinary origin, which can be glomerular, non-glomerular (tubulointerstitial (Ballarin et al. 2011), toxic chemicals, anticoagulant-related nephropathy (Brodsky et al. 2011), renovascular or metabolic abnormalities), or urological (tumors, especially bladder cancer (Muto et al. 2014), trauma, pyelonephritis, or renal calculi). Glomerular hematuria is typically associated with significant proteinuria (Simerville et al. 2005). However, 20 % of patients with biopsy-proven glomerulonephritis (GN) present with hematuria alone (Fassett et al. 1982), among whom IgA nephropathy is the most common diagnosis (Simerville et al. 2005). Hematuria of non-pathological significance is observed following strenuous exercise and during menstruation. In both cases, results of repeated urinalysis after 48–72 h should be negative. On the other hand, hemoglobinuria may result from RBC lysis after urination or from intravascular hemolysis (hemolytic anemia, transfusion reactions, severe burns, spider bites, and infections). In contrast, myoglobinuria due to rhabdomyolysis can occur with strenuous exercise, trauma, prolonged coma, convulsions, muscle-wasting diseases, and alcoholism, as well as with statin medication and several kinds of drug abuse.

Besides the multiple conditions stated above, where hematuria has an “injury marker” value, recent data have demonstrated its prognostic value in diseases like IgA nephropathy (Moreno et al. 2012) and acute kidney injury (AKI) (Gutierrez et al. 2007). An association with the long-term incidence of ESRD has also been established (“risk factor” value) (Vivante et al. 2011). Despite this evidence, there are pitfalls in quantifying hematuria, making it more difficult to objectively assess the impact of therapy and to predict the outcome: firstly, because it is not routinely quantified as accurately as albuminuria and proteinuria are (Moreno et al. 2012), and secondly, because the assessment of glomerular hematuria may be confounded by coexisting non-glomerular hematuria. In the absence of hard data, it is currently difficult to make recommendations on the specific target values of hematuria that should influence treatment decision making.

pH

Urine pH reflects the presence of hydrogen ions, but this does not necessarily reflect the overall acid load in urine because most of the acid is excreted as ammonia (Johnson et al. 2014) and a very small amount as weak organic acids. The normal range for urine pH is 4.5–7.8.

As noted above, pH is usually measured with a reagent test strip. Most commonly, methyl red and bromthymol blue double indicators are used to give a broad range of colors at different pH values (Brenner et al. 2011), covering a range from 5.0 to 9.0. A more precise method, the pH meter, measures the concentration of hydrogen ions, whose positive charge creates an electrical potential detected by an electrode.

The importance of urinary pH is primarily as an aid in the differential diagnosis of acid–base disorders. Moreover, this marker can be used to monitor conditions that require urine to be maintained at a specific pH, either to prevent drug nephrotoxicity, like that due to precipitation of methotrexate in the renal tubules, or to inhibit renal calculi formation (Sand and Jacobsen 1981). Regarding stone formation, an alkaline pH favors the crystallization of calcium- and phosphate-containing stones, whereas an acidic pH promotes uric acid or cystine stones (Wagner and Mohebbi 2010). For that reason, pH control is essential to assess treatment response. Urine pH measurement is also mandatory if accurate microscopic evaluation of crystals is necessary.

However, pH by itself provides little useful diagnostic information, and it must be considered in conjunction with other information about the patient because many of the deviations can be explained by non-pathological conditions. A low pH is observed in metabolic acidosis, one of the main causes of which is high-protein meals (Brenner et al. 2011), and in hypovolemia, in which aldosterone stimulation increases hydrogen secretion in the distal tubules. Indeed, a low urine pH may help distinguish prerenal AKI from ATN, which is typically associated with higher values. Other conditions associated with an alkaline pH are infection by urease-positive organisms (Proteus), renal tubular acidosis, vomiting, and gastric suction. Non-pathological causes include a vegetarian diet (minimal nitrogen and acid generation), diuretics, and alkali therapy.

Bilirubin

Bilirubin is a highly pigmented yellow compound that results from hemoglobin degradation. Its appearance in urine may provide an early indication of liver disease due to obstructive disease or hepatocellular injury, but not of hemolysis, because only conjugated bilirubin is water soluble and renally excreted.

Although reagent test strips are very sensitive to bilirubin, detecting as little as 0.05 mg/dL, this measurement is not a sensitive marker of hepatic abnormalities. In addition, measurement errors often occur: false-positive results because of urine contamination with stool and other pigments, and false negatives because of prolonged sample storage and exposure to light, which promote bilirubin photo-oxidation to biliverdin (Young 1990). Consequently, since the introduction of serum tests of liver enzyme function, this measurement has lost most of its clinical application.

Urobilinogen

Urobilinogen appears in urine because, as it circulates in the blood back to the liver, it passes through the kidney and is filtered by the glomerulus. A small amount of urobilinogen (<1 mg/dL) is normally found in urine.

The sensitivity of this strip test reaction increases with temperature (ideally, it is read at room temperature). False-positive reactions may occur due to porphobilinogen, p-aminosalicylic acid, sulfonamides, methyldopa, procaine, and chlorpromazine compounds, among others. False-negative results occur most frequently when specimens are improperly preserved (photo-oxidation of urobilin) or formalin is used as a preservative.

Although this parameter also has little clinical application, it can be slightly increased in chronic constipation, and its main pathological causes are liver disease and hemolytic disorders (Strasinger and Di Lorenzo 2014). Some authors (Gorchynski et al. 2009) tried to prove its utility in adult blunt abdominal trauma patients but found it to be a poor predictor of intra-abdominal injury and thus not clinically useful.

Ketones

Ketones represent intermediate products of fat metabolism, namely, acetone (2 %), acetoacetic acid (20 %), and β-hydroxybutyrate (78 %). Normally, measurable amounts of ketones do not appear in urine, because all the metabolized fat is completely broken down into carbon dioxide and water. However, when carbohydrate use is compromised, body fat stores are metabolized to supply energy, increasing ketones in the blood, which may lead to dehydration, electrolyte imbalance, acidosis, and coma.

With the exception of β-hydroxybutyrate, they can be detected in urine through the dipstick nitroprusside reaction (Jacobs et al. 1990), where acetoacetic acid reacts with sodium nitroprusside to produce a purple color. Results are reported qualitatively as negative, trace, small (1+), moderate (2+), or large (3+), or semiquantitatively as negative, trace (5 mg/dL), small (15 mg/dL), moderate (40 mg/dL), or large (80–160 mg/dL) (Strasinger and Di Lorenzo 2014).

Positive reactions can occur in prolonged fasting or starvation, strenuous exercise, malabsorption, vomiting, and alcoholic or diabetic ketoacidosis. Published data showed a good correlation between urinary and capillary blood ketones at lower glycemia values, but not at higher ones. As a result, this marker can be used to exclude ketosis, but not to confirm ketoacidosis (Taboulet et al. 2007). It is nevertheless valuable in the management and monitoring of type 1 diabetes mellitus because ketonuria is an early indicator of insufficient insulin, indicating the need to increase the dosage, or of other concurrent events. False-positive reactions can be due to delayed readings after prolonged sample standing and to the presence of mercaptoethane sulfonate sodium (MESNA), captopril (sulfhydryl groups), levodopa, ascorbic acid, or phenazopyridine, among others. False-negative tests can happen if specimens are not preserved properly, allowing the metabolization and volatilization of ketone compounds.

Nitrites and Leukocyte Esterase

Nitrites result from the conversion of nitrates in the presence of bacteriuria. This marker is highly specific for bacteriuria, but several uropathogens other than Enterobacteriaceae do not reduce nitrate to nitrite (Patel et al. 2005); thus, a negative test does not necessarily mean that the urine is free of bacteria.

Nitrite is detected by the Griess reaction, in which nitrite at an acidic pH reacts with an aromatic amine to form a diazonium compound that then reacts with a tetrahydrobenzoquinoline compound to produce a pink color (positive). Different shades of pink may be produced, but they do not reflect the degree of bacteriuria. Optimal results for a urinary nitrite test are obtained by analyzing an early morning specimen that has been incubating in the bladder for 4 h or more. A clean midstream specimen is important to reduce bacterial contamination.

Apart from the first situation mentioned above, false-negative results may also occur when urine has been in the bladder for a short period, dietary nitrate is absent, nitrate reduction has gone beyond the nitrite stage to form nitrogen (Garingalo-Molina 2000), or bacterial metabolism is inhibited by the presence of antibiotics or large amounts of ascorbic acid (Strasinger and Di Lorenzo 2014). On the other hand, false-positive tests can result from improperly preserved specimens and highly pigmented urine (Strasinger and Di Lorenzo 2014).

Although a semiquantitative culture of a urine specimen is the only method that can provide detailed documentation of a bacterial UTI, it is costly and takes at least 24 h. Thus, a screening test can be carried out using the dipstick, whose nitrite and LE results have been correlated with culture (Lohr 1991).

The LE test detects esterase, an enzyme released by both intact and lysed WBCs (except lymphocytes), which catalyzes the hydrolysis of an acid ester to produce an aromatic purple azo dye. Because it also detects lysed cells, it could be more accurate than microscopic evaluation, although leukocyturia should still be quantified by microscopy. The test should be performed on a fresh specimen, and reactions are reported as trace, small, moderate, and large or trace, 1+, 2+, and 3+. In addition to UTI screening, it also detects infections caused by Trichomonas and Chlamydia (Rahman et al. 2014) and inflammation of renal tissues (like interstitial nephritis) that produces leukocyturia without bacteriuria.

False-positive reactions can be due to the presence of strong oxidizing agents, formalin in the container, highly pigmented urine, and nitrofurantoin. False-negative results may occur with high-SG urine; antibiotics like gentamicin, cephalexin, cephalothin, and tetracycline; and high concentrations of protein (>500 mg/dL), glucose (>3 g/dL), oxalic acid, and ascorbic acid (Strasinger and Di Lorenzo 2014).

Although several studies have confirmed the low sensitivity and specificity of the dipstick test, these two positive reactions suggestive of UTI can be used to guide empirical treatment, as well as in follow-up (Demilie et al. 2014), because their combination improves the performance of each (Semeniuk and Church 1999). In contrast, a negative reaction should be an indication for culture if clinical features are present. Even though they are not intended to replace urine culture, they can be cost-effective in reducing the need to perform these cultures (Wise et al. 1984). Currently, they are often ordered to screen high-risk patients who are frequently asymptomatic, like pregnant women, children, the elderly, and patients with recurrent UTI, CKD, and diabetes.

Glucose

Glucose testing is the most frequently performed chemical analysis on urine.

One of the oldest methods available is based on copper reduction, in which glucose reduces copper sulfate to cuprous oxide in the presence of alkali and heat (Benedict reaction). Color ranges from a negative blue through a positive green to orange/red and should be compared with the manufacturer’s chart to estimate the quantitative measurement of glucose. Its sensitivity is limited to a minimum of 200 mg/dL, and at high glucose levels a “pass-through” phenomenon may occur, in which the color reverts to negative without the previous, temporary positive result being reported (Strasinger and Di Lorenzo 2014). Regarding false positives, this test is not specific for glucose, so other reducing substances, such as sugars including galactose, lactose, fructose, maltose, and pentose, ascorbic acid, certain drug metabolites, oxidizing detergents in containers, and antibiotics like cephalosporins, may induce false-positive reactions (Brigden et al. 1992). Although it has a low sensitivity and specificity for detecting glycosuria, this method detects urine galactose; thus, it can be used to screen newborns for an inborn error of metabolism in which a lack of the enzyme galactose-1-phosphate uridyl transferase prevents breakdown of ingested galactose (Strasinger and Di Lorenzo 2014).

A more recent method uses a dipstick impregnated with a mixture of components, including glucose oxidase, which promotes a double sequential enzyme reaction: glucose is oxidized to gluconic acid and peroxide, which in turn drives the oxidation of a chromogen to form a colored compound that varies according to the chromogen used by each manufacturer. Under ideal conditions, the intensity of the color should be directly proportional to the glucose concentration, thereby allowing a quantitative estimation (Gray and Millar 1953). Here, interference by other reducing agents does not occur, thanks to the additional chemicals in the strip. Results are reported as negative, trace, 1+, 2+, 3+, and 4+, which correspond to quantitative measurements ranging from 100 mg/dL to 2 g/dL, or 0.1 % to 2 %, provided by the manufacturers (Strasinger and Di Lorenzo 2014). Although its sensitivity decreases with high SG and low temperature, the greatest source of false-negative results is prolonged sample standing, which subjects glucose to bacterial degradation. High ketone levels may also interfere with the results, but only in the presence of low glycosuria levels, which rarely happens.

Glycosuria has clinical significance when the kidney proximal tubules are unable to reabsorb all the filtered glucose despite normal plasma glucose levels, or when urinary leakage occurs due to hyperglycemia above 180 mg/dL (10 mmol/L), at which point tubular reabsorption of glucose has reached its threshold. In the first setting, glycosuria can occur in isolation in ESRD and cystinosis, but it usually occurs with Fanconi syndrome (phosphaturia, uricosuria, and aminoaciduria). Possible causes are multiple myeloma, heavy metal exposure, and treatment with certain medications, including tenofovir, lamivudine, cisplatin, valproic acid, and aminoglycosides (Haque et al. 2012). In rarer situations, it may also be an isolated defect associated with genetic mutations, as in the solute carrier family 5 member 2 (SLC5A2) gene encoding the sodium/glucose cotransporter 2 (SGLT2), which affects renal glucose transport (Magen et al. 2005). Hyperglycemic scenarios are mostly caused by diabetes, but a minority can be due to pancreatitis, acromegaly, Cushing syndrome, hyperthyroidism, pheochromocytoma, and thyrotoxicosis. Nevertheless, healthy people can have glycosuria caused by a temporary lowering of the renal threshold during pregnancy and after a meal with a high glucose content. Therefore, collection conditions should be controlled (a previous 12-h fast is recommended) and results interpreted according to the setting. The use of currently available reagent strip methods for both blood and urine glucose testing has greatly increased early diagnosis of diabetes, allowed patients to monitor themselves at home, and improved outcomes. For purposes of diabetes monitoring, specimens are usually tested 2 h after meals. Additionally, a first morning specimen does not always represent a fasting specimen because glucose from an evening meal may remain in the bladder overnight, so patients should be advised to empty the bladder and collect a second specimen. Overall, fasting glycosuria testing has a specificity of 98 % and a sensitivity of 17 % (Singer et al. 1989) as a screening test for diabetes.

Albuminuria

Albuminuria has been the hallmark used to develop clinical practice and research in the areas of diabetic nephropathy, CKD, and cardiovascular disease, among others.

The normal rate of albumin excretion is less than 30 mg/day (20 mcg/min), being about 4–7 mg/day (3–5 mcg/min) in healthy young adults, and it increases with age and with body weight. Microalbuminuria is currently defined as a persistent albumin excretion of between 30 and 300 mg/day (20–200 mcg/min) or a urinary albumin/creatinine ratio (UACR) of 2.5–35 mg/mmol in male subjects and 3.5–35 mg/mmol in female subjects (KDIGO 2013; Stevens and Levin 2013). Albumin excretion above 300 mg/day (200 mcg/min) is considered macroalbuminuria. Previous reviews and clinical practice guidelines have called for the abandonment of the term “microalbuminuria” (Ruggenenti and Remuzzi 2006). The Kidney Disease: Improving Global Outcomes (KDIGO) 2012 clinical practice guideline for the evaluation and management of CKD discourages the use of the term when classifying patients to a CKD stage according to their level of albuminuria. The suggested approach is to classify albuminuria into three categories: A1, normal to mildly increased (instead of normoalbuminuria); A2, moderately increased (instead of microalbuminuria); and A3, severely increased (instead of macroalbuminuria or proteinuria) (KDIGO 2013).
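As a compact restatement of the thresholds above, the sketch below classifies a 24-h albumin excretion into the KDIGO categories; it is a minimal illustration using only the cutoffs quoted in the text, and the function name is hypothetical.

```python
def kdigo_albuminuria_category(albumin_mg_per_day):
    """Classify 24-h albumin excretion using the cutoffs given in the text:
    <30 mg/day A1 (normal to mildly increased), 30-300 mg/day A2 (moderately
    increased, formerly microalbuminuria), >300 mg/day A3 (severely increased)."""
    if albumin_mg_per_day < 30:
        return "A1 (normal to mildly increased)"
    if albumin_mg_per_day <= 300:
        return "A2 (moderately increased)"
    return "A3 (severely increased)"

print(kdigo_albuminuria_category(20))   # A1
print(kdigo_albuminuria_category(150))  # A2
print(kdigo_albuminuria_category(500))  # A3
```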

Transient elevations in the excretion of albumin can be seen in fever, infection, exercise, heart failure, nonspecific joint inflammation, poor glycemic control (hemoglobin A1c >8 %), obesity, and hyperlipidemia (low-density lipoprotein cholesterol >120 mg/dL) (Wald 2014). Posture, smoking, and diet may also influence the albumin excretion rate (MacIsaac et al. 2014). However, persistent microalbuminuria has been consistently associated not only with diabetic nephropathy but also with hypertension (>160/100 mmHg), coronary heart disease, and overall cardiovascular disease, in both diabetic and nondiabetic patients (Newman et al. 2005). As illustrated in the Prevention of Renal and Vascular End-Stage Disease (PREVEND) study (Abdelmalek et al. 2014), it also enhances the predictive value of ST and T wave changes for cardiovascular disease. Concerning kidney disease, it is predictive of developing clinical proteinuria, a faster decline in glomerular filtration rate (GFR), and a higher risk of ESRD (Newman et al. 2005). Microalbuminuria has generally been accepted as the first injury marker of diabetic nephropathy. Another study showed that cardiovascular risk for patients with type 2 diabetes starts to increase with urinary albumin excretion levels above 1 mcg/min, even before the upper limit of normoalbuminuria is reached (Ruggenenti et al. 2012). In addition to its “injury marker” value, it may also be useful in monitoring treatment. Moreover, both the Effect of a Multifactorial Intervention on Mortality in Type 2 Diabetes (STENO-2) and the Irbesartan in Patients with Type 2 Diabetes and Microalbuminuria (IRMA-2) trials concluded that early intervention with anti-albuminuric drugs and monitoring is cost-effective, leading to amelioration of GFR loss, a decrease in ESRD incidence, and prolongation of life. A similar association between those drug effects and other well-accepted surrogate cardiovascular end points, such as blood pressure and cholesterol, has been reported (Roscioni et al. 2014). However, it should be noted that there is no test capable of distinguishing between microalbuminuria linked to cardiovascular disease and that linked to renal disease (MacIsaac et al. 2014).

Regarding its measurement, although a 24-h urine collection is the gold standard for the detection of microalbuminuria (Bennett et al. 1995), it has been suggested that screening can be more simply achieved through a random spot specimen.

Albumin dipsticks are based on an antibody bound to the strip that forms an enzyme conjugate, which reacts with its substrate to produce a color range according to pH changes. Colors range from pale green to aqua blue; intensity is proportional to the concentration of albumin itself and is graded on a scale from 0 to 4+ (Lamb et al. 2009), which can be correlated to a quantitative scale supplied by the manufacturers. Although urine concentration may strongly influence this marker’s result, limited data suggest that correcting the value for urine SG improves the ability to identify abnormal results (Constantiner et al. 2005). Other false positives may occur after the use of iodinated radiocontrast agents (Morcos et al. 1992), with a highly alkaline urine (pH >8) (Simerville et al. 2005), with gross hematuria or a urocrit >1 % (Tapp and Copley 1988), and when specific antiseptics (chlorhexidine, benzalkonium) are used before collection (Magen et al. 2005). Abnormally high values should always be confirmed by repeated measurements (KDIGO 2013), and if a true positive is confirmed, accurate quantification by another method is needed. On the other hand, factors unrelated to low SG may be associated with false-negative results, as some albumin components are not immunologically reactive with the strip component (Busby and Bakris 2004).

Overall, the albumin dipstick test is a very specific but not sensitive method, because many patients with microalbuminuria may be missed unless their urine is highly concentrated. Still, more reliable strips have been designed and have been successful in detecting moderately increased albuminuria with a sensitivity and specificity of 80–97 % and 33–80 %, respectively (Comper and Osicka 2005).

In addition to this method, the high-performance liquid chromatography technique is able to assess all intact urinary albumin, even non-immunoreactive albumin, thereby increasing sensitivity and specificity (Busby and Bakris 2004) and proving useful for earlier detection. Other types of reagent strip provide simultaneous measurement of albumin (or protein) and creatinine, thereby allowing their excretion to be compared and producing a semiquantitative UACR, which corresponds to a more accurate estimation of the 24-h excretion.

Conventional reagent pads detect a minimum of 30 mg/dL and may include other proteins besides albumin. Strips using the dye bis(3′,3″-diiodo-4′,4″-dihydroxy-5′,5″-dinitrophenyl)-3,4,5,6-tetrabromo-sulfonephthalein (DIDNTB) can measure albumin as low as 8 mg/dL, without the inclusion of other proteins, increasing overall sensitivity and specificity. Reaction interference by highly alkaline urine can also be controlled with bis-(heptapropylene glycol) carbonate reagent (Strasinger and Di Lorenzo 2014).

As stated before, limitations regarding the adequacy of the 24-h collection are not negligible, so this vital step can be checked by quantifying the 24-h urine creatinine and comparing this value to the expected one. As a general rule, in adults under the age of 50, daily creatinine excretion should be 20–25 mg/kg (177–221 micromol/kg) of lean body weight in men and 15–20 mg/kg (133–177 micromol/kg) of lean body weight in women, figures that progressively decline to about 50 % of these values by the 90s. Because of these limitations and the need for laborious control, some alternatives have been proposed. Most commonly, the random spot UACR is used to estimate 24-h albumin excretion and to follow the effects of treatment in patients with cardiovascular and albuminuric kidney diseases.
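A minimal sketch of this adequacy check, assuming the rule-of-thumb figures quoted above (20–25 mg/kg of lean body weight in men and 15–20 mg/kg in women under 50); the function names and the pass/fail logic are illustrative only, not a validated rule.

```python
def expected_creatinine_mg_per_day(lean_weight_kg, sex):
    """Expected 24-h creatinine excretion range for adults under ~50 years:
    men 20-25 mg/kg and women 15-20 mg/kg of lean body weight (per the text);
    excretion declines progressively with age."""
    low, high = (20, 25) if sex == "male" else (15, 20)
    return low * lean_weight_kg, high * lean_weight_kg

def collection_is_plausibly_complete(measured_creatinine_mg, lean_weight_kg, sex):
    """Flag a 24-h collection as plausibly complete when the measured creatinine
    falls within the expected range (a crude screen only)."""
    low, high = expected_creatinine_mg_per_day(lean_weight_kg, sex)
    return low <= measured_creatinine_mg <= high

# Example: a man with 70 kg of lean body weight is expected to excrete
# ~1400-1750 mg/day, so a measured 900 mg/day suggests an incomplete collection.
print(expected_creatinine_mg_per_day(70, "male"))
print(collection_is_plausibly_complete(900, 70, "male"))
```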

Reagent strips containing copper sulfate (CuSO4), 3,3′,5,5′-tetramethylbenzidine (TMB), and diisopropylbenzene dihydroperoxide (DBDH) allow for the detection of creatinine based on the pseudoperoxidase activity of copper–creatinine complexes, producing a color change from orange through green to blue. Quantitative results may be estimated according to charts as 10, 50, 100, 200, or 300 mg/dL (0.9, 4.4, 8.8, 17.7, or 26.5 mmol/L). Falsely elevated results can be caused by bloody or colored urine and by cimetidine. The absence of creatinine in readings is considered abnormal, as it is usually found in concentrations of 10–300 mg/dL (Strasinger and Di Lorenzo 2014).
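The two sets of chart values above are the same concentrations in different units; as a sketch, the conversion simply uses the molar mass of creatinine (about 113.1 g/mol). The function name is illustrative.

```python
CREATININE_MOLAR_MASS_G_PER_MOL = 113.12  # approximate molar mass of creatinine

def creatinine_mg_dl_to_mmol_l(mg_per_dl):
    """Convert a creatinine concentration from mg/dL to mmol/L:
    mg/dL * 10 gives mg/L, and dividing by the molar mass gives mmol/L."""
    return mg_per_dl * 10 / CREATININE_MOLAR_MASS_G_PER_MOL

# Reproduces the chart values quoted in the text (0.9, 4.4, 8.8, 17.7, 26.5 mmol/L).
for level in (10, 50, 100, 200, 300):
    print(level, "mg/dL ->", round(creatinine_mg_dl_to_mmol_l(level), 1), "mmol/L")
```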

According to several studies, the UACR measured in a morning spot urine correlates well with the timed excretion rate (Ginsberg et al. 1983; Heerspink et al. 2010). One trial reported that a random UACR above 30 mg/g had a sensitivity of 100 % for the detection of moderately increased albuminuria (Nathan et al. 1987). Usually, the albumin concentration is measured in mg/dL and divided by the creatinine concentration, also in mg/dL, yielding a dimensionless number that estimates the 24-h albumin excretion in grams per day (Shidham and Hebert 2006). If SI units (mg/mmol) are used, the value is divided by 8.8.
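A sketch of the calculation just described (the g/day reading of the dimensionless ratio is the approximation cited from Shidham and Hebert 2006; function names and example values are illustrative).

```python
def spot_uacr_mg_per_g(albumin_mg_dl, creatinine_mg_dl):
    """Spot urine albumin/creatinine ratio. Dividing mg/dL by mg/dL gives a
    dimensionless (g/g) number that roughly estimates 24-h albumin excretion
    in g/day; multiplied by 1000 it is expressed in the more usual mg/g."""
    return (albumin_mg_dl / creatinine_mg_dl) * 1000

def mg_per_g_to_mg_per_mmol(uacr_mg_g):
    """Convert the ratio to SI units (mg/mmol) by dividing by 8.8, as in the text."""
    return uacr_mg_g / 8.8

# Example: albumin 9 mg/dL with creatinine 120 mg/dL -> ~75 mg/g (~8.5 mg/mmol),
# i.e., in the moderately increased (A2) range.
ratio = spot_uacr_mg_per_g(9, 120)
print(round(ratio, 1), round(mg_per_g_to_mg_per_mmol(ratio), 1))
```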

The quantification of UACR is mandatory in patients with AKI and CKD and in all patients with a previously elevated semiquantitative result. On the one hand, it is imperative to exclude a false positive; on the other hand, the quantification of excretion is essential for establishing the diagnosis and prognosis and for monitoring response to treatment.

The optimal time to measure the UACR remains uncertain (Wald 2014). In an initial study, the best correlation with a 24-h collection occurred with first morning and before-bedtime samples (Ginsberg et al. 1983). A larger study also found the best correlation for the first morning void, although the difference compared with other timings was not significant (Witte et al. 2009). There is some evidence that random untimed samples may be more prone to error because they generally show higher intra- and interindividual coefficients of variation than those found for the first or second morning samples (Witte et al. 2009; Naresh et al. 2012). In this setting, one report (Saydah et al. 2013) showed that random urine samples appear to overestimate the prevalence of albuminuria compared to first morning collection. Additionally, variance in creatinine excretion may also influence both random spot and 24-h sample results. According to the rates reported before, the average 24-h urine creatinine excretion is approximately 1000 mg/day (11.4 mmol/day) per 1.73 m2, so the accuracy of this ratio is diminished if creatinine excretion is markedly higher or lower than the population average. The groups most prone to error are individuals with large muscle mass, male sex, black or Hispanic ethnicity, and younger age, who may excrete much more creatinine than 1000 mg/day, so that the spot UACR underestimates 24-h albuminuria; conversely, among patients with small muscle mass, such as cachectic, older, and female patients, creatinine excretion may be much lower and the 24-h excretion overestimated. Given these UACR limitations, several investigators have developed an “estimated albumin excretion rate” (eAER) to more accurately predict the 24-h albumin excretion (Abdelmalek et al. 2014; Fotheringham et al. 2014). The eAER can be calculated by multiplying the spot UACR by the expected 24-h creatinine generation. This method may be particularly important in the groups mentioned above.
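A minimal sketch of the eAER idea, assuming the expected creatinine generation is taken from the rule-of-thumb figures mentioned earlier; all names and example values are illustrative.

```python
def eaer_mg_per_day(spot_uacr_mg_per_g, expected_creatinine_g_per_day):
    """Estimated albumin excretion rate: spot UACR (mg per g of creatinine)
    multiplied by the expected 24-h creatinine generation (g/day)."""
    return spot_uacr_mg_per_g * expected_creatinine_g_per_day

# Example: with a spot UACR of 25 mg/g, a muscular young man generating
# ~1.8 g creatinine/day has an eAER of ~45 mg/day (moderately increased),
# whereas assuming the ~1 g/day population average would suggest a normal
# excretion of ~25 mg/day.
print(eaer_mg_per_day(25, 1.8))
print(eaer_mg_per_day(25, 1.0))
```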

In conclusion, KDIGO Guidelines recommend screening for moderately increased albuminuria among diabetic patients for early detection of nephropathy and nondiabetic patients at increased risk for CKD or cardiovascular disease, such as those with hypertension and metabolic syndrome (KDIGO 2013).

Protein

Proteinuria is the hallmark of renal disease because it often implies an increase in glomerular permeability, which allows the filtration of normally non-filtered macromolecules. According to the previous evidence described for albuminuria, proteinuria of increasing severity is associated with a faster rate of decline in GFR, regardless of its baseline (Turin et al. 2013), is an independent risk predictor for ESRD (Iseki 2013), and is associated with the risk of myocardial infarction and mortality (Hemmelgarn et al. 2010).

Proteinuria is defined as urinary protein excretion of more than 150 mg/day (10–20 mg/dL). Normal urinary proteins include albumin, serum globulins, and Tamm–Horsfall protein (uromodulin), the latter secreted by the renal distal tubular epithelial cells. Proteinuria can be organized into different categories according to the source and pathogenesis of the defect, as well as to its clinical significance. It may be divided into glomerular, tubular, overflow, and post-renal proteinuria. Patients may have more than one type of proteinuria. Glomerular proteinuria is due to increased filtration through the glomerular capillary wall. It is a sensitive marker of glomerular disease, occurring in hypertension, where the increased pressure of blood entering the glomerulus overrides its selective filtration, and in all types of GN, where different kinds of membrane lesions occur. More benign causes, such as orthostatic, high fever, or exercise-induced proteinuria, usually cause an isolated, transient, and asymptomatic proteinuria of less than 1–2 g/day (Wald 2014). Orthostatic proteinuria is diagnosed by asking the patient to empty the bladder before going to bed, to collect a specimen immediately upon rising in the morning (negative), and to collect a second one after standing for several hours (positive) (Strasinger and Di Lorenzo 2014). Tubular proteinuria includes low-molecular-weight proteins, such as beta2-microglobulin, immunoglobulin light chains, retinol-binding protein, and polypeptides derived from the breakdown of albumin, which have molecular weights under 25,000 daltons, compared with the 69,000 daltons of albumin. These smaller proteins can be filtered through the glomerulus and are almost completely reabsorbed in the proximal tubule. Interference with proximal tubular reabsorption due to tubulointerstitial diseases, or even some primary glomerular syndromes, can lead to increased excretion of these smaller proteins (Carter et al. 2006). Tubular proteinuria often fails to be diagnosed because the dipstick is not sensitive to non-albumin proteins and they are usually found in low concentrations. Mildly increased excretion of immunoglobulin light chains (Bence Jones proteins) is polyclonal (kappa and lambda) and not injurious to the kidney.

In contrast, the monoclonal nature of the light chains in the overflow proteinuria seen in multiple myeloma is highly nephrotoxic (Wald 2014). This type of proteinuria results from conditions affecting the plasma before it reaches the kidney and is therefore not indicative of actual renal disease, so it may also be classified as prerenal proteinuria. It is caused by increased levels of low-molecular-weight plasma proteins that exceed the tubular reabsorptive capacity, like acute phase reactants in infection and inflammation, and the conditions that lead to it are frequently transient. Common examples, other than myeloma kidney, are due to lysozyme in acute myelomonocytic leukemia, myoglobin in rhabdomyolysis, and free hemoglobin in intravascular hemolysis (Barratt and Topham 2007). Patients with myeloma kidney may also develop a component of tubular proteinuria, since the excreted light chains may be toxic to the tubules, leading to diminished reabsorption (Wald 2014). Because dipstick results may be falsely negative, suspected cases of multiple myeloma must be diagnosed by performing serum and urinary immunoelectrophoresis.

Post-renal proteinuria may be caused by inflammation in the urinary tract (often in UTI, but also in nephrolithiasis or tumors), although the mechanism is unclear. The excreted proteins are often non-albumin (typically IgA or IgG), present in small amounts, and usually accompanied by leukocyturia (Wald 2014).

Regarding measurement techniques, there are two semiquantitative methods to screen patients for proteinuria: dipstick (similar to albuminuria) and sulfosalicylic acid (SSA) precipitation.

Protein dipstick principles and main interference conditions are similar to those described above in the section on “Albuminuria.” Although very specific, the dipstick test is not sensitive to low levels of proteinuria (<10–20 mg/dL) or low concentrations of γ-globulins and Bence Jones proteins. A result of 1+ corresponds to approximately 30 mg/dL, 2+ to 100 mg/dL, 3+ to 300 mg/dL, and 4+ to 1,000 mg/dL (House and Cattran 2002). Dipstick urinalysis can reliably predict proteinuria with sensitivities and specificities of greater than 99 % (Woolhandler et al. 1989).
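The grade-to-concentration correspondence quoted above can be written as a small lookup table; the values are the approximations from House and Cattran (2002) and vary by manufacturer, so the sketch is illustrative only.

```python
# Approximate protein concentrations for dipstick grades (House and Cattran 2002);
# actual values depend on the manufacturer's chart.
PROTEIN_GRADE_MG_DL = {"negative": 0, "1+": 30, "2+": 100, "3+": 300, "4+": 1000}

def approximate_protein_mg_dl(grade):
    """Return the approximate protein concentration for a dipstick grade."""
    return PROTEIN_GRADE_MG_DL[grade]

print(approximate_protein_mg_dl("3+"))  # ~300 mg/dL
```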

In contrast to the dipstick test, which primarily detects albumin, SSA is a cold precipitation test that detects all proteins in urine at a sensitivity of 5–10 mg/dL (Rose 1987). A significantly positive SSA test in conjunction with a negative dipstick often indicates immunoglobulin light chain excretion caused by dysproteinemias. Various concentrations and amounts of SSA can be used to precipitate protein, and methods vary greatly among laboratories. It is mostly performed by mixing one part of urine supernatant (2.5 mL) with three parts of 3 % SSA and grading the resultant turbidity according to a reporting scheme: 0 or no turbidity corresponds to 0 mg/dL; trace or slight turbidity to 1–10 mg/dL; 1+ or mild turbidity to 15–30 mg/dL; 2+ or white cloud without precipitate to 40–100 mg/dL; 3+ or white cloud with fine precipitate to 150–350 mg/dL; and 4+ or flocculent precipitate to >500 mg/dL (Strasinger and Di Lorenzo 2014). The SSA test result will be overestimated by as much as 1.5–2 g/L in the presence of iodinated radiocontrast agents (Morcos et al. 1992), penicillins, or sulfisoxazole, and with gross hematuria (Simerville et al. 2005).

Both the SSA and dipstick test can detect urinary lysozyme. Total lysozyme excretion is usually below 1 g/day but can exceed 4.5 g/day in some patients with acute monocytic or myelocytic leukemia (Mok et al. 1994). Thus, lysozyme excretion should be measured in patients who have a persistently positive urine dipstick for proteinuria in the absence of albuminuria, particularly if other signs of the nephrotic syndrome are absent.

Once again, patients with persistent proteinuria should undergo a quantitative measurement. As for albuminuria, the 24-h sample is the gold standard method. Nevertheless, based on the assumption that the potential error in determining proteins in a spot urine sample does not exceed the 24-h sample error (Morales et al. 2004), the protein/creatinine (P/C) ratio in spot urine was developed as an alternative (Methven et al. 2010). The same sample can even be used for microscopic investigation. However, although the correlation between the P/C ratio and 24-h proteinuria has been established, authors are not unanimous (Birmingham et al. 2007), because the results may be influenced by factors such as deviation of creatinine excretion from the average. Other authors suggest that this correlation varies with the level of proteinuria (Methven et al. 2011), with the majority finding acceptable agreement in the range 0.5–2 g/day (Leung et al. 2007), but not in the nephrotic range (Antunes et al. 2008). Moreover, some data also revealed that this correlation is not reliable in some diseases. One example is lupus nephritis (LN), where the accurate evaluation of proteinuria is critical to clinical management, because it is currently the most important available biomarker of disease activity and renal prognosis. Furthermore, proteinuria is often a primary end point in clinical trials of new therapies and therefore must be measured with precision. Because of the possible lack of correlation in this disease, short-interval timed urine collections have been studied as a surrogate for 24-h collections to increase patient compliance and improve the accuracy of the results. One study found that a 12-h overnight collection provides a more accurate result than shorter ones (Fine et al. 2009). Although a random spot P/C measurement may not be reliable enough to guide decisions for patients with LN, it may be useful as a screening and monitoring test, justified by its advantages concerning ease of collection, reliability, accuracy, and diagnostic speed (Guedes-Marques et al. 2013). It could be used as the preferred marker in subgroups of subjects who find it more difficult to properly collect 24-h urine, such as children, elderly people, and patients with intellectual disabilities, incompatible professional activities, or poor adherence. Another factor to be considered is the timing of the sample, which is influenced by the daily circadian fluctuation of both protein and creatinine excretion. According to some data, the best estimation is probably obtained with morning samples, but not the first void (Saydah et al. 2013).

The technical and methodological details about spot P/C ratio are described in the section on “Albuminuria.”

Proteinuria selectivity can be assessed in nephrotic patients through the ratio of IgG clearance (160,000 Da) to transferrin clearance (88,000 Da). Although infrequently used, highly selective proteinuria (ratio <0.1) in nephrotic children suggests the diagnosis of minimal change disease and predicts corticosteroid responsiveness (Johnson et al. 2014).
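A sketch of the selectivity index just described: because each clearance is U × V / P and both proteins are measured in the same sample, the urine flow V cancels out; the variable names, example values, and the <0.1 cutoff quoted from the text are illustrative.

```python
def selectivity_index(urine_igg, serum_igg, urine_transferrin, serum_transferrin):
    """Ratio of IgG clearance to transferrin clearance. With clearance = U*V/P,
    the common urine flow V cancels, so only paired urine and serum
    concentrations (in any consistent units) are needed."""
    return (urine_igg / serum_igg) / (urine_transferrin / serum_transferrin)

# Illustrative values: a ratio below 0.1 indicates highly selective proteinuria,
# which in nephrotic children suggests minimal change disease.
si = selectivity_index(urine_igg=2.0, serum_igg=900.0,
                       urine_transferrin=8.0, serum_transferrin=250.0)
print(round(si, 3), "highly selective" if si < 0.1 else "poorly selective")
```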

In conclusion, some authors consider that a normal spot P/C ratio is sufficient to rule out pathological proteinuria, but that an elevated value should be confirmed and quantified with a 24-h collection (Price et al. 2005). Other investigators have reported a poor correlation at high levels of proteinuria, as well as in some diseases such as LN, where the ratio is useful for monitoring but a 24-h collection is recommended before treatment decisions are made.

Uric Acid

Hyperuricemia and gout correlate with risk factors for cardiovascular disease. They are caused by an overproduction and/or inefficient renal clearance of urate.

The fractional renal clearance of urate (FCU; renal clearance of urate divided by renal clearance of creatinine) expresses urate clearance as a fraction of creatinine clearance. It provides information about the efficiency of the renal tubular mechanisms of urate clearance by correcting for the effect of the glomerular filtration rate (Kannangara et al. 2012). The spot-FCU has been shown to be a convenient, valid, and reliable indicator of how efficiently the kidney removes urate from the circulation, and it has been used in studies investigating the molecular mechanisms of renal urate clearance and in identifying people at increased risk for cardiovascular disease.
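A minimal sketch of the spot-FCU calculation described above: both clearances are measured on the same spot sample, so the urine flow term cancels and the familiar fractional-excretion form remains. Variable names are illustrative.

```python
def fractional_clearance_of_urate(urine_urate: float, serum_urate: float,
                                  urine_creatinine: float,
                                  serum_creatinine: float) -> float:
    """Spot fractional clearance of urate (FCU), expressed as a percentage.

    FCU = (urate clearance / creatinine clearance) * 100
        = (U_urate * S_creatinine) / (S_urate * U_creatinine) * 100

    Urine and serum concentrations must be in the same units for each analyte.
    """
    return (urine_urate * serum_creatinine) / (serum_urate * urine_creatinine) * 100.0
```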

Furthermore, urinary uric acid assessment is necessary for the early diagnosis and adequate treatment of urolithiasis. A recent paper (Sáez-Torres et al. 2014) proposed the analysis of late-afternoon spot urine collection as an appropriate sample to evaluate patient-specific urinary risk factors.

Uric acid excretion is more favorable in alkaline urine. Based on this evidence, dietary intervention in metabolic syndrome, focused on hyperuricemia, recommends an intake of alkalizing fruits and vegetables, the effect of which can be monitored through urinary pH and uric acid excretion in a random spot sample.

Electrolytes

Urinary excretion of electrolytes can be very useful for diagnosing and monitoring several diseases, mainly because it distinguishes between systemic and kidney-limited disorders. In the former, serum electrolyte levels are abnormal and the kidney adapts its function to maintain homeostasis; in the latter, a renal tubular disorder itself produces abnormal electrolyte excretion.

Once more, the electrolyte/creatinine ratio in a random spot urine sample has been found to be an alternative to 24-h measurements. Furthermore, as described for uric acid, a random spot urine sample may also be used to measure the fractional renal clearance of an electrolyte, which expresses its excretion corrected for creatinine clearance. The main practical disadvantage of this method is that a blood sample is required, which is invasive and inconvenient for the patient.
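The same fractional-clearance logic applies to any electrolyte for which a paired serum value is available; the generic sketch below (variable names illustrative) makes the calculation explicit.

```python
def fractional_excretion(urine_x: float, serum_x: float,
                         urine_creatinine: float, serum_creatinine: float) -> float:
    """Fractional excretion of an electrolyte X, as a percentage.

    FE_X = (U_X * S_creatinine) / (S_X * U_creatinine) * 100

    Requires a paired blood sample, which is the practical drawback noted above.
    """
    return (urine_x * serum_creatinine) / (serum_x * urine_creatinine) * 100.0

# Example: fractional excretion of sodium (FENa), with urine Na 40 mEq/L,
# serum Na 140 mEq/L, urine creatinine 100 mg/dL, serum creatinine 1.0 mg/dL.
print(round(fractional_excretion(40, 140, 100, 1.0), 2))  # about 0.29 %
```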

Sodium

In steady-state conditions, the kidneys handle most of the sodium (Na) consumed in a day, and the majority (up to 95 %) is excreted in urine within 24 h. A systematic review of studies comparing 24-h and spot urine collections for estimating population salt intake (Ji et al. 2012) found that the available studies are heterogeneous and that there is no uniform pool of data with which to assess the suitability of random spot samples as an alternative for measuring 24-h Na excretion (i.e., for estimating salt intake). However, most studies found acceptable reproducibility. In this setting, most authors believe that random spot urine samples require a greater number of collections but are still more convenient and feasible for monitoring. Thus, in programs of population salt reduction, this method is less suitable for providing an initial absolute measure of salt intake, but it may be useful for following repeated assessments over time.

According to one study (Mann and Gerber 2010), late-afternoon or early-evening samples are more reliable in predicting 24-h Na excretion and may be a cost-effective alternative in salt intake assessment in clinical practice and epidemiological studies.

Potassium

A study with the INTERSALT trial population (Tanaka et al. 2002) demonstrated that the ratio of Na or potassium (K) to creatinine concentration in a random spot specimen is directly proportional to the 24-h ratio. Furthermore, the Na:K ratio may be more reliable in estimating increased risk of cardiovascular disorders such as arterial hypertension.
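For illustration, the Tanaka study cited above also yielded a widely used equation for estimating 24-h Na excretion from a casual specimen. The sketch below uses the coefficients as they are commonly reported for that equation; they should be treated as an assumption to be verified against the original paper rather than as a definitive implementation.

```python
def estimated_24h_sodium_tanaka(spot_na_meq_l: float, spot_cr_mg_dl: float,
                                age_years: float, weight_kg: float,
                                height_cm: float) -> float:
    """Estimated 24-h urinary sodium excretion (mEq/day) from a spot sample.

    Sketch of the Tanaka-type estimate: a predicted 24-h creatinine excretion
    (mg/day) is computed from age, weight, and height, the spot Na/Cr ratio is
    scaled by it, and a power-law calibration is applied. Coefficients are
    quoted from common secondary sources and should be checked against
    Tanaka et al. (2002).
    """
    predicted_cr_24h = -2.04 * age_years + 14.89 * weight_kg + 16.14 * height_cm - 2244.45
    x_na = (spot_na_meq_l / (spot_cr_mg_dl * 10.0)) * predicted_cr_24h
    return 21.98 * x_na ** 0.392
```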

Calcium and Phosphate

Calcium (Ca)/creatinine and phosphate (Ph)/creatinine ratios in random spot urine specimens have also been found to be a reliable method for estimating daily urinary Ca and Ph excretion (Gokçe et al. 1991). They are useful in evaluating mineral bone disease in CKD, in the diagnosis and monitoring of urolithiasis, and in the differential diagnosis of hereditary disorders of renal tubular transport (Bartter, Gitelman, and Fanconi syndromes).

Others

One study (Ilich et al. 2009) compared the concentrations of the essential elements magnesium (Mg) and zinc (Zn) in 24-h and spot urine samples and found that, although spot urine sampling might not replace 24-h measurements in all cases, it could be a reliable alternative in those evaluated. Urinary chloride measured in a spot sample is quite useful for evaluating volume and acid-base status and for establishing the differential diagnosis between extra-renal losses of Na (very low urinary chloride, around 1 mmol/L) and renal tubular disorders (>20 mmol/L).

A significant number of other substances can be detected by random spot urinalysis. This is useful for detecting drug consumption or toxicity (alcohol, benzodiazepines, and others) and for measuring fibroblast growth factor 23 (FGF23), a phosphaturic hormone, to assess bone mineral disease in CKD, among many other less widely used and explored approaches.

Microscopic Markers

Microscopic evaluation is an essential part of spot urine analysis because it confirms urine dipstick findings and also allows the identification of formed elements that are not evaluated by other methods. Their quantification is mostly semiquantitative, and there is considerable interobserver variability, which keeps them from being ideal markers. As an example, one study showed only fair to moderate agreement among nephrologists in identifying important structures (Wald et al. 2009). Nevertheless, their presence in urine can be evaluated in a random spot sample and is often crucial in pointing toward a specific diagnosis.

Regarding specimen handling, the sample should be examined while fresh or adequately preserved, because elements like RBCs, WBCs, and hyaline casts disintegrate rapidly, particularly in dilute alkaline urine. Refrigeration may cause precipitation of amorphous urates, phosphates, and other non-pathological crystals that can obscure other elements in the urine sediment.

The second morning voiding is the most appropriate to obtain because it avoids the lysis of particles that can occur in the bladder overnight. The midstream clean-catch specimen minimizes external contamination and bacterial growth without pathological significance. As for other techniques, diluted specimens may cause false-negative readings.

Concerning sample preparation, the specimen is centrifuged and the supernatant removed; the sediment is then resuspended, transferred to a slide, and covered with a coverslip. Phase contrast microscopy is recommended because it improves the identification of almost all particles, whereas polarized light is mandatory for the correct identification of some lipids and crystals (Fogazzi 2010). For correct examination, both the pH and SG of the sample should be known, because an alkaline pH (≥7.0) and a low SG (especially <1.010) favor the lysis of cells, which causes discrepancies between the dipstick and microscopic examination. The elements observed are quantified as number per microscopic field or, if counting chambers are used, as number per milliliter (very precise, but rarely used in everyday practice) (Strasinger and Di Lorenzo 2014).

The main formed elements are cells (erythrocytes, leukocytes, and epithelial cells), lipids, casts, and crystals.

Erythrocytes

The assessment of these elements may play a key role in the diagnostic approach to isolated hematuria. Although there is no agreement on the criteria defining hematuria, it is commonly defined as the presence of two or more RBCs per high-power field (Cohen and Brown 2003). Additionally, microscopic evaluation can distinguish two main types of erythrocytes: isomorphic, with regular shapes and contours, derived from the urinary excretory system, and dysmorphic, with irregular shapes and contours, which have a glomerular origin (Fig. 2). Thus, according to their relative proportion, hematuria may be classified as non-glomerular or glomerular, respectively. However, this proportion threshold is not unanimous among authors, either.
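Because the threshold proportion of dysmorphic erythrocytes is not agreed upon, any rule-based classification must expose it as a parameter. The sketch below is purely illustrative; the default cut-off of 40 % is an assumption, not a validated criterion.

```python
def classify_hematuria(dysmorphic_rbc: int, total_rbc: int,
                       dysmorphic_threshold: float = 0.4) -> str:
    """Classify hematuria as glomerular or non-glomerular by RBC morphology.

    The threshold (fraction of dysmorphic erythrocytes) is an assumed,
    laboratory-dependent parameter: as noted above, published cut-offs vary
    among authors, so 0.4 here is only a placeholder.
    """
    if total_rbc == 0:
        return "no hematuria"
    fraction_dysmorphic = dysmorphic_rbc / total_rbc
    return "glomerular" if fraction_dysmorphic >= dysmorphic_threshold else "non-glomerular"
```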

Fig. 2
figure 2

Microscopic observation of dysmorphic erythrocytes. Microscopic observation (×400) of dysmorphic erythrocytes in a random spot urine sample of a patient with glomerulonephritis (Kindly provided by Drª. Fernanda Carvalho from Curry Cabral Hospital)

As already described above (see the chemical parameter blood), the most common pathological causes of microscopic hematuria include kidney stones, malignancy, and glomerular disease.

Leukocytes

More than simply identifying leukocytes by chemical analysis, it is valuable to distinguish the different types because they may point to specific diseases. Neutrophils usually indicate bacterial UTI, but they may also result from urine contamination by genital secretions, especially in young women. They can also be found in interstitial nephritis and proliferative GN; in both conditions they are fewer in number, and in GN they are mostly accompanied by dysmorphic erythrocytes and proteinuria. Eosinophils can be detected by applying Wright’s or Hansel’s stain to the urine sediment (Nolan et al. 1986) and are currently regarded as a nonspecific finding because they may be present in various types of GN, prostatitis, chronic pyelonephritis, urinary schistosomiasis, and cholesterol embolism. Lymphocytes may indicate acute cellular rejection in renal allograft recipients. Macrophages may be engorged with lipid droplets, appearing as “oval fat bodies,” usually seen in nephrotic syndrome, but they may also appear in other settings such as Fabry disease.

Cells

Different kinds of cells (renal tubular epithelial, transitional epithelial, and squamous) can be found in normal or pathological conditions according to their proportion and presentation. Epithelial cells may appear in the urine after being shed from anywhere within the genitourinary tract; renal tubular epithelial cells are especially common in ATN.

Lipids

Lipids are found in urine as droplets of different sizes, either isolated or in clusters, or as oval fat bodies (Hotta et al. 2000). These particles mainly contain cholesterol esters and free cholesterol and show a Maltese cross appearance with symmetric arms under polarized light. They are typical of glomerular diseases associated with nephrotic-range proteinuria and of Fabry disease.

Casts

Casts are cylindrical structures formed in the lumen of distal renal tubules and collecting ducts (Fig. 3). Their matrix is made of Tamm–Horsfall glycoprotein (uromodulin), and they have different appearances, each with distinct features and a specific clinical significance, summarized in Table 1.

Fig. 3
figure 3

Microscopic observation of a large granular cast. Microscopic observation (×400) of a large granular cast in a random spot urine sample of a patient with chronic kidney disease (Kindly provided by Drª. Fernanda Carvalho from Curry Cabral Hospital)

Table 1 Individual features and clinical significance of urinary casts

Crystals

Correct identification of urinary crystals requires broad knowledge of crystal morphology and appearance under polarized light. Whether crystals form in the urine depends on a variety of factors, including the relative concentration of constituent molecules, urine pH, and the presence of crystallization inhibitors. Examination of urine for crystals is informative in the assessment of patients with stone disease, some rare inherited metabolic disorders (such as cystinuria, oxalosis, and phosphoribosyltransferase deficiency), and suspected drug nephrotoxicity (Fogazzi 2010). The main individual features and clinical significance of crystals are summarized in Table 2.

Table 2 Individual features and clinical significance of urinary crystals

Organisms

Bacteria are often seen in urine, but their clinical significance is generally guided by patient symptoms and other urinary features like the presence of neutrophils (Fig. 4). Fungi are also frequent (Fig. 4). Examiners should be aware that specimen handling is vital to this examination because a sample that is not fresh can be contaminated.

Fig. 4
figure 4

Microscopic observation of fungus, bacteria, and leukocytes. Microscopic observation (×400) of fungus, bacteria, and leukocytes in a random spot urine of a patient with sepsis and acute kidney injury (Kindly provided by Drª. Fernanda Carvalho from Curry Cabral Hospital)

Conclusions

It is unusual to focus on only a single risk marker for the development and progression of a disease. In this setting, random spot urine analysis allows several markers to be identified at a single time point, without an invasive procedure. There is a need to develop broader models of progressive kidney diseases and their relationship with the cardiovascular background, models that include novel pathways and risk markers beyond those related to the traditional proteinuric pathway. In addition, more trials are needed to determine which sample is most appropriate for analysis and whether this alternative method is reliable across all types of diseases.

Finally, many other biomarkers, such as genetic fragments and cytokines, can already be measured in a random spot urine sample, but their comparison with 24-h sample measurements and many of their applications have not yet been reported.

Potential Applications to Prognosis, Other Diseases, or Conditions

Random spot urine examination is already much more than a complementary exam limited to the field of nephrology. Currently identified urinary markers have been found to be associated with diseases other than kidney abnormalities. Moreover, new biomarkers are being identified to help in the early detection of cancer, tuberculosis, HIV, malaria, and potentially many other diseases.

In the future, random spot urinary genetic markers, among others, may predict chronic or acute events even before traditional laboratory and clinical signs become positive. Adding to this “risk” and “diagnostic” value, future biomarkers may contribute to novel models of disease pathogenesis, which could be crucial in developing new therapeutic targets and activity markers.

Perhaps one day, a single occasional urine sample could be used to screen for oncological, autoimmune, metabolic, and other diseases immediately after birth. Additionally, other markers could be associated with responses to specific therapies, helping to choose the best treatment for each patient according to their individual profile.

Summary Points

  • This chapter focuses on the analysis of random spot urine, which can be easily obtained at any time of the day with a single urination. It allows several markers of otherwise asymptomatic disease to be identified and avoids errors related to 24-h collection.

  • The collection and handling of the urine sample are crucial to avoiding errors, and both the suitability and the rejection criteria must be defined.

  • Random spot urine markers can be organized into three groups, according to their laboratory method assessment: physical, chemical, and microscopic markers.

  • Physical markers include color, odor, clarity, and specific gravity.

  • Chemical examination includes the identification of protein, blood, glucose, pH, bilirubin, urobilinogen, ketones, nitrites, and leukocyte esterase. They are probably the most useful in daily practice, because their value in the diagnosis, monitoring, and prognosis of glomerular diseases has long been proven.

  • Microscopic evaluation entails the detection of crystals, cells, casts, and organisms.

  • Chemical results are the most widely used, because of their (semi)quantitative presentation and possible correlation with quantitative timed collections, but physical and microscopic findings may play a key role in many differential diagnoses and in monitoring.