Introduction

Telemedicine involves the delivery of healthcare and related information over long distances by combining biomedical signals with information technology and means of communication. The 21st century has witnessed the advent of mobile telemedicine [16], in which the same services are delivered at any time and from anywhere using high-speed and high-reliability mobile communication systems. Such mobile communication systems can enable the delivery of improved medical services, and they can be applied to emergency ambulance services, the mobile hospital (M-hospital), general healthcare, early warning systems for diseases, and illness rehabilitation. Consider a situation in which an ambulance is dispatched from a hospital to respond to a medical emergency; the ambulance carries only the paramedics. While the paramedics provide pre-hospital medical care en route to the hospital, the videos, images, electrocardiogram (ECG), and pulse biomedical signals of the patient can be transmitted to an emergency room via the Global System for Mobile communication (GSM), General Packet Radio Service (GPRS), third-generation (3G) mobile cellular systems, WiMAX, and mobile satellite communication (MSC) systems. Emergency room physicians can thus gain an understanding of the physiological status of patients while they are still in the ambulance, and prepare various medical resources for them.

E-hospitals and digital hospitals enable medical care personnel to improve the efficiency and quality of service (QoS) of medicine. As a next step, efforts should be directed toward realizing an M-hospital. Doing so would involve addressing certain interesting issues: for example, how wireless communication systems can avoid interfering with biomedical sensing instruments while being applied to integrate these instruments. Healthcare systems can employ wireline or wireless communication technology to link a patient's home and a hospital. Doctors can use healthcare systems to provide early warnings of diseases and for illness rehabilitation. Finally, healthcare systems can enable a decrease in the average length of patient stay and a decrease in the number of patient visits to the hospital.

The remainder of this paper is organized as follows. The Maritime telemedicine section describes maritime telemedicine systems that employ satellite communications. The Cellular mobile telemedicine section describes cellular mobile telemedicine systems, and the Short-range wireless telemedicine section describes short-distance wireless telemedicine systems. The Discussion and Conclusion sections present the discussion and conclusions of this paper.

Maritime telemedicine

There is a strong requirement for emergency medical services that can provide personal healthcare to travelers on aircraft and ships, and to people who live in remote locations such as islands, mountains, and tropical rainforests. For this purpose, the studies in [1, 8, 10, 11, 13, 14, 17] adopt interactive medical video conferencing with the transmission of vital signs using a seamless MSC technique. The data transmission rate of second-generation MSC systems is 10–100 kbps; for example, the international maritime satellite communication system (INMARSAT) has a data transmission rate of 24 kbps. In addition, the data transmission rate of third-generation MSC systems is 100 kbps–100 Mbps. In this section, we discuss ETS-V [8], MERMAID [10], ACTS [11], TelePACS [13], MEDI [14], satellite wideband code division multiple access (SW-CDMA) [1], and satellite orthogonal frequency division multiplexing (SOFDM) [17].

ETS-V maritime telemedicine system

The Engineering Test Satellite-Five (ETS-V) maritime telemedicine system [8] supports clinical diagnosis and emergency medical services for travelers on aircraft and ships. ETS-V is a 150-kg synchronous communication satellite that was launched from the Tanegashima Space Center in 1989. Murakami et al. [8] discussed the channel capacity, system size, transmission bit error rate (BER), real-time operation, and electromagnetic interference of this mobile telemedicine system. They proposed the use of ETS-V to transmit one color-video signal, one audio signal, three-channel ECG signals, and blood pressure. The characteristics of these biomedical signals are as follows. The color-video signal contains one 8-bit, 256 × 256 pixel color image per 20 s, with a compression rate of 10:1; the transmission rate of this signal is 8 kbps. The audio signal contains 6,000 8-bit samples per second, with a compression rate of 4.8:1; the transmission rate of this signal is 10 kbps. Each channel of the ECG signal contains 200 samples per second, with a compression rate of 8:1; the transmission rate of this ECG signal is 600 bps. The blood pressure signal contains one 16-bit sample per minute, with a compression rate of 1:1; the transmission rate of this signal is 0.3 bps. The ETS-V maritime telemedicine system includes a video camera, a microphone, a sound amplifier, a portable cardiograph, an automatic blood pressure measurement device, and a computer with a display. It occupies a space of 1 m³ and weighs 30 kg. ETS-V employs channel coding and automatic repeat request (ARQ) retransmission to reduce the BER. Buffers are used to solve the problems of limited channel capacity, retransmission delay, and real-time operation. The transmission specifications of the ETS-V maritime telemedicine system are listed in Table 1.
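
The transmission-rate figures above follow directly from the sample rates and compression ratios. The short Python sketch below reproduces this kind of bit-rate budget; treating the color image as having three 8-bit color channels is our own assumption for illustration, since the channel count is not stated explicitly above.

# Bit-rate budget for the ETS-V biomedical signals (illustrative sketch).
# The three-color-channel assumption for the video signal is ours, not from [8].

def rate_bps(bits_per_period, period_s, compression):
    """Compressed transmission rate in bits per second."""
    return bits_per_period / compression / period_s

# Color video: one 8-bit 256 x 256 image per 20 s, assumed 3 color channels, 10:1 compression.
video = rate_bps(256 * 256 * 8 * 3, 20.0, 10.0)   # ~7.9 kbps (reported: 8 kbps)

# Audio: 6,000 samples/s x 8 bits, 4.8:1 compression.
audio = rate_bps(6000 * 8, 1.0, 4.8)              # 10 kbps (reported: 10 kbps)

# Blood pressure: one 16-bit sample per minute, no compression.
bp = rate_bps(16, 60.0, 1.0)                      # ~0.27 bps (reported: 0.3 bps)

print(f"video ~{video / 1e3:.1f} kbps, audio ~{audio / 1e3:.1f} kbps, BP ~{bp:.2f} bps")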

Table 1 Transmission specifications of the ETS-V maritime telemedicine system [8]

MERMAID maritime telemedicine system

The MERMAID maritime telemedicine system [10] was developed by the European Union (EU) in 1998. MERMAID uses INMARSAT links and the Integrated Services Digital Network (ISDN) mainly to support medical emergency aid for seafarers; however, it can also be used to aid travelers. MERMAID includes features such as telemedicine conferencing and multimedia communication. Most ocean-going vessels and transoceanic merchant ships are still not equipped with maritime telemedicine systems. Among approximately 20,000 maritime medical emergencies, roughly 37% involved seafarers suddenly becoming ill. Through a survey, it was found that roughly 43% of seafarers desire access to maritime telemedicine and healthcare systems. The MERMAID system was launched in response to these requirements. MERMAID uses communication media such as optical fiber, coaxial cables, satellites, cellular radio, wireless local area networks, ultra-wideband (UWB), and Bluetooth, along with network technologies such as the asynchronous transfer mode (ATM) network, ISDN, and the Internet. It can operate in one of two modes. The basic mode has a low data transmission rate as well as a low channel bandwidth, and it is used to transmit medical records. The advanced mode has a high data transmission rate as well as a high channel bandwidth, and it is used to support interactive telemedicine conferencing and to transmit ECG signals.

ACTS maritime telemedicine system

The ACTS maritime telemedicine system [11] employed integrated ground and satellite networks for the high-speed transmission of medical images and data. By the winter of 1992, the National Aeronautics and Space Administration (NASA), USA, had conducted maritime telemedicine experiments using the Advanced Communications Technology Satellite (ACTS) [12]. The uplink and downlink signals of ACTS were transmitted in the Ka band with different carrier frequencies, and the system adopted the ATM data transmission protocol to realize a data transmission rate of up to 622 Mbps. ACTS was used to carry out a blood vessel radiography experiment and a family medicine clinical experiment. Blood vessel radiography is a clinical technique used to study the heart of a patient, wherein a fluorescent chemical agent is injected into a blood vessel; this agent travels to the heart, enabling a clear image of the condition of the serum and blood vessels to be obtained using X-ray imaging. In the blood vessel radiography experiment, mobile clinical X-ray images of two distinct areas of the heart were transmitted and discussed over the ACTS system. The data transmission contained 30 8-bit, 512 × 512 pixel monochrome digital X-ray images in the digital imaging and communications in medicine (DICOM) medical image format, with a data transmission rate of 60 Mbytes/s. In the family medicine clinical experiment, the medical signals of a patient were measured using various instruments such as a cardiograph and an X-ray machine, digitized, and stored on a server; the signals were then transmitted using the ACTS system. A number of digital X-ray images of the patient, with sizes of 1–500 Mbytes, were relayed to the destination via the satellite within the shortest possible time. The TCP buffer size and congestion window were set to 15 Mbytes and 1.5 Mbytes in the respective experiments.
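
The choice of multi-megabyte TCP buffer and congestion-window sizes on a geostationary satellite link is driven by the bandwidth-delay product. The minimal Python sketch below illustrates that reasoning; the 0.5-s round-trip time is an assumed typical geostationary value, not a figure taken from [11, 12].

# Bandwidth-delay product for a high-rate geostationary satellite link (illustrative).
# The 0.5-s round-trip time is an assumed typical GEO value, not taken from [11, 12].

link_rate_bps = 622e6   # ACTS ATM link rate
rtt_s = 0.5             # assumed geostationary round-trip time

bdp_bytes = link_rate_bps * rtt_s / 8
print(f"bandwidth-delay product ~ {bdp_bytes / 1e6:.0f} Mbytes")
# A TCP window much smaller than this value leaves the link idle while waiting for
# acknowledgements, which is why buffer/window sizes on the order of megabytes
# (15 Mbytes and 1.5 Mbytes in the experiments) were needed.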

TelePACS maritime telemedicine system

The TelePACS maritime telemedicine system was developed by Hwang et al. [13] in 2000. Their WWW-based TelePACS can access every permitted picture archiving and communication system (PACS) server over the Internet. TelePACS was developed using the Java programming language, and it can be used to browse a vast amount of medical data and images using browsers such as Netscape. TelePACS was designed to realize online and offline real-time telemedicine applications. The system includes a computer, a modem for receiving medical data over the Internet, a QPSK demodulator, an MPEG2 multiplexer decoder, and a peripheral component interconnect (PCI) bus interface. The received signal was first frequency-converted and then input to an analog-to-digital converter to obtain a digital bit stream; QPSK demodulation and error correction were then applied to this bit stream. The demodulated bit stream was input to the MPEG2 multiplexer decoder to recover the digital audio and video signals, which were finally converted into analog voice and video signals. The TelePACS picture storage and communication system was developed jointly by Shinchon Severance Hospital, an affiliated hospital of Yonsei University, and Mediface Corp. The system handles DICOM-format medical images obtained by computed tomography (CT) and magnetic resonance imaging (MRI). Users can browse these medical images using a web browser and store them over HTTP. The medical image reader enables users to zoom in and out of the entire area of a medical image. The vendor employed a 20:1 static image compression standard (JPEG) and a 20:1 wavelet image compression standard for CT images. The average peak signal-to-noise ratio (PSNR) of these medical images was approximately 30–35 dB. This study [13] also discussed the FTP transmission performance of the TelePACS system.
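
The reported image quality of roughly 30–35 dB refers to the standard peak signal-to-noise ratio between an original image and its compressed version. A minimal sketch of that computation for 8-bit images is shown below; it uses synthetic data rather than actual CT images.

import numpy as np

def psnr(original, compressed, peak=255.0):
    """Peak signal-to-noise ratio in dB between two 8-bit images."""
    mse = np.mean((original.astype(np.float64) - compressed.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")
    return 10.0 * np.log10(peak ** 2 / mse)

# Synthetic example: an 8-bit "CT slice" and a version with mild compression-like noise.
rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(512, 512), dtype=np.uint8)
noisy = np.clip(img + rng.normal(0, 5, img.shape), 0, 255).astype(np.uint8)
print(f"PSNR ~ {psnr(img, noisy):.1f} dB")  # values around 30-35 dB indicate comparable quality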

MEDI maritime telemedicine system

The MEDI maritime telemedicine system [14] is used to provide medical services to users at any place and any time via a next-generation mobile satellite system. The maritime telemedicine communication technique complements the cellular ground mobile telemedicine technique rather than competing with it, because present cellular ground mobile techniques utilize only a restricted part of the spectrum. This study [14] integrates cellular and wired/wireless communication networks to form a seamless mobile telemedicine network. Currently, satellite systems provide several types of multimedia communication services such as satellite TV broadcasting, broadband relay transmission, and MSC systems. The MEDI maritime telemedicine system is developed using the Java programming language and is therefore platform independent; Java applets can be embedded in an HTML document, thus enabling web page applications. Furthermore, because it adopts the TCP/IP protocol, it allows the use of any web browser and provides users with a broadcast linkage service. The MEDI maritime telemedicine system enables a medical image database to be searched and supports remote medical conferencing. The former allows data to be stored in and retrieved from a medical image database containing patient medical images and supports X-ray image processing. The latter enables medical diagnostic images to be observed and discussed in real time. The MEDI maritime telemedicine system is a web page application contained in an HTML document. All medical images are in the Papyrus 3.0 format, which is compatible with DICOM 3.0. The MEDI user sub-system enables remote server browsing via HTTP. The remote server comprises three sub-servers: (1) a database server, (2) an HTTP server, and (3) a physician server. The database server stores patient data such as sex, age, history, health insurance number, and so on; the HTTP server buffers these data and displays them on the user screen, and its buffer can store biomedical signals to achieve online or offline interactive mobile telemedicine services; the physician server allows remote medical conferences to be conducted, with the Papyrus 3.0 medical image format being used to manage and store medical documents from the remote medical conference. The MEDI maritime telemedicine system employs the Eutelsat satellite system [15] and the ITALSAT satellite system [16] for the downlink to and uplink from a mobile device, respectively. The Eutelsat satellite system comprises 25 synchronized satellites that are distributed in orbit from 15° W longitude to 70.5° E longitude. ITALSAT, which is the first Italian satellite, carries 10 sets of communication relays for 3 communication links at 30/20 GHz and 50/40 GHz, and it provides capacity for 12,000 telephone calls. It has dimensions of 2.3 m × 2.7 m × 3.5 m and weighs roughly 900 kg. In the MEDI system, the round-trip time is 600 ms, and the packet loss rate is 0.014%. FTP transmissions of 10, 100, and 1,000 KB files achieve a data transmission rate of 20 kbps with no packet loss. The HTTP transmission time required for digital X-ray files having a size of 135 KB is 15 s, and that for 12-MB video files is 95 s.
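
The reported transfer times can be translated into effective throughput, which is a quick consistency check when evaluating such links. The sketch below simply divides payload size by transfer time using the file sizes and times quoted above; it is an illustration, not an analysis from [14].

# Effective throughput implied by the MEDI transfer figures quoted above (illustrative).

def effective_throughput_kbps(size_bytes, seconds):
    return size_bytes * 8 / seconds / 1e3

http_xray = effective_throughput_kbps(135 * 1024, 15)         # 135-KB X-ray file in 15 s
http_video = effective_throughput_kbps(12 * 1024 * 1024, 95)  # 12-MB video file in 95 s

print(f"HTTP X-ray: ~{http_xray:.0f} kbps, HTTP video: ~{http_video:.0f} kbps")
# Small transfers are dominated by the 600-ms satellite round-trip time,
# so their effective throughput is well below that of large transfers.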

Satellite wideband code division multiple access maritime telemedicine system

This study [1] discusses a Ka-band satellite wideband code division multiple access (SW-CDMA)-based maritime telemedicine system. The study adopts orthogonal variable spreading factor (OVSF) codes, pseudo-noise codes, unequal error protection, adaptive modulation, the object composition Petri net (OCPN) model, multi-satellite transmission schemes with direct mapping (DM) or space-time block codes (STBC), and a power assignment mechanism to transmit biomedical signals. OVSF codes, pseudo-noise codes, unequal error protection, and adaptive modulation are used in WCDMA-based third-generation mobile cellular systems. In the mobile telemedicine system, the acceptable BERs for data, audio, and video packets are assumed to be 10^-7, 10^-3, and 10^-4, respectively [7]. Packets requiring a low BER are provided with a high transmission power, a high-level error protection scheme, low-level modulation, and STBC with the multi-satellite transmission strategy. In contrast, packets that can tolerate a higher BER are provided with a low transmission power, a low-level error protection scheme, high-level modulation, and simple DM with the multi-satellite transmission strategy. Patients and doctors can use microphones and charge-coupled device (CCD) image sensors to carry out an interactive conference. Doctors provide a diagnosis based on the history of a patient and on the body temperature, pulse, ECG, and EEG biomedical signals, which are transmitted over the SW-CDMA maritime telemedicine platform. In this study [1], the transmitted biomedical signals include blood pressure and body temperature, and the 108-kbps bit streams of the 12-channel ECG signals and the 262.144-kbps bit streams of the 64-channel EEG signals of every patient are converted into data bit streams. A G.729 encoder is employed to compress the 64-kbps audio signals into 8-kbps audio bit streams, an H.264 encoder is adopted to convert the 13.15-Mbps video signals into 1.13-Mbps video bit streams, and JPEG2000 is used to compress the 3,640-KB X-ray medical image signals into a 128-KB image bit stream. In the transport architecture [1], the data, audio, and video bit streams are assembled into data, audio, and video packets, respectively. In addition, a model is required to specify the temporal constraints among the various data objects that must be observed at the time of playback; for this purpose, this study [1] adopts the well-known OCPN model. Because the OCPN model can specify the throughput resulting from the transmission of concurrent multimedia objects, the sum of the data, audio, and video packets can be calculated, and a real-time telemedicine conference can be realized. A design concept for a satellite OFDM-based mobile telemedicine system is also discussed in [17].
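
The adaptive strategy in [1] essentially maps each media type to a transmission profile according to its acceptable BER. The sketch below illustrates that mapping logic in Python; the specific modulation orders, code-rate labels, and power levels are placeholders of our own, not the values used in [1].

# Illustrative mapping of media type -> transmission profile, in the spirit of the
# unequal error protection scheme in [1]. Profile values are placeholders, not from [1].

ACCEPTABLE_BER = {"data": 1e-7, "audio": 1e-3, "video": 1e-4}

def transmission_profile(media_type):
    ber = ACCEPTABLE_BER[media_type]
    if ber <= 1e-6:    # most error-sensitive traffic (e.g., ECG/EEG data packets)
        return {"power": "high", "coding": "strong FEC (low rate)",
                "modulation": "QPSK", "diversity": "STBC over multiple satellites"}
    elif ber <= 1e-4:  # moderately sensitive traffic (e.g., video)
        return {"power": "medium", "coding": "medium-rate FEC",
                "modulation": "8-PSK", "diversity": "direct mapping"}
    else:              # most error-tolerant traffic (e.g., audio)
        return {"power": "low", "coding": "weak FEC (high rate)",
                "modulation": "16-QAM", "diversity": "direct mapping"}

for media in ("data", "video", "audio"):
    print(media, transmission_profile(media))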

Cellular mobile telemedicine

First-generation cellular ground mobile communication systems (CGMCSs) aimed to provide analog voice services, second-generation CGMCSs aimed to provide digital voice services, third-generation CGMCSs aimed to provide mobile multimedia services, and fourth-generation CGMCSs aimed to provide applications requiring high speeds and large communication bandwidths. These CGMCSs include GSM, the personal handyphone system (PHS), WCDMA, OFDM, and WiMAX. In this section, we discuss the application of these CGMCSs in the AMBULANCE [18], WAP [19], Airmed-Cardio [20], Teletrauma [21], UMTS [22], and WiMAX [24] telemedicine systems.

AMBULANCE telemedicine system

When an ambulance arrives at the site of a medical emergency, paramedics provide pre-hospital medical care while transmitting images and the vital signs of the patient to the emergency ward in real time, and they carry out a real-time interactive conference with the physician in the emergency room. Here, it is necessary to make the most of the golden rescue time. The AMBULANCE mobile telemedicine platform includes a camera, an electrocardiograph, a notebook computer, and a GSM modem; it connects via GSM to the wired and wireless local area networks of hospitals. The features of this system are as follows: (1) the mobile device is portable and is meant for use by a single user; (2) long battery life; (3) ease of use; (4) a user-friendly interface; (5) full-duplex communication with the emergency ward in real time; (6) a computer-aided diagnosis tool; (7) encrypted communications; and (8) access to the various databases of the hospital. This system adopts GSM both for encryption and for other communications. The design structure of the mobile terminal includes real-time medical signal and image acquisition, user commands, a system control module, a mobile storage module, a medical signal display, data compression and encryption, and a GSM modem. This work [18] adopts the TCP/IP protocol with a data transmission rate of 9,600 bps. Real-time medical signal transmission includes three-channel ECG signals, blood sugar concentration, and the pulse signal, with each channel containing 200 8-bit samples per second, corresponding to a data transmission rate of 1,600 bps. Image transmissions contain 320 × 240 pixel images having sizes of 2.5–3 KB, which require 3–5 s to transmit.
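
Because the 9,600-bps GSM data channel must carry the real-time signals and the still images together, a simple bandwidth budget is useful. The sketch below reproduces that arithmetic; treating the 1,600-bps figure as a per-channel ECG rate is our interpretation of the description above.

# Simple bandwidth budget for the AMBULANCE GSM link (illustrative).
# Treating the 1,600-bps figure as a per-channel ECG rate is our assumption.

LINK_RATE_BPS = 9_600     # GSM circuit-switched data rate used in [18]

ecg_per_channel = 200 * 8  # 200 samples/s x 8 bits = 1,600 bps
ecg_total = 3 * ecg_per_channel

for image_kb in (2.5, 3.0):
    t = image_kb * 1024 * 8 / LINK_RATE_BPS
    print(f"{image_kb} KB image over the full link: ~{t:.1f} s (reported 3-5 s with overhead)")

print(f"three ECG channels alone occupy {ecg_total} bps of the {LINK_RATE_BPS}-bps link")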

WAP telemedicine system

This work [19] discusses a telemedicine system that employs the wireless application protocol (WAP). Using WAP 1.1 with the GSM 1800-MHz circuit-switched data technique, a mobile platform can store and relay biomedical signals of a patient, such as ECG signals, along with the patient history, hospital messages, and the physician's advice, thus introducing a novel feature to mobile telemedicine in what is currently becoming an increasingly popular system. The WAP terminal, the WAP gateway, and the content server communicate in HTTP mode. For example, when the WAP telemedicine system is applied, a cardiogram browser, heart rate, a patient history browser, clinical or hospital messages, and the physician's advice are provided. This telemedicine system adopts a 101 × 43 pixel display. These applications are stored in the content server; whenever required, they are downloaded from the content server, and the signals are stored in the WAP terminal using the Wireless Markup Language (WML). The external and internal databases store the ECG signals, heart rate, patient history, clinical or hospital messages, and the physician's advice. These data are transmitted to the common gateway interface via the database interface, to the WAP gateway via the common gateway interface, and to the WAP terminal via the WAP gateway. The patient information menu flow stores patient information in the WAP terminal; the acquisition layer can be used to browse the patient information, select general information, and browse cardiograms through the menu, as modeled in the sketch below. When browsing cardiograms, the cardiogram patient number is first displayed along with the date and time; the user then selects the cardiogram patient number and the biomedical signals to be displayed, and selects whether to display the next record or adjust the parameters.
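
The menu-driven navigation described above can be thought of as a small state machine on the handset. The Python sketch below is a generic illustration of that flow, not the actual WML deck of [19]; the menu labels are paraphrased from the description.

# Illustrative model of the WAP patient-information menu flow described in [19].
# Menu labels are paraphrased; this is not the actual WML code of the system.

MENU = {
    "main": ["general information", "patient history", "cardiogram browser",
             "hospital messages", "physician's advice"],
    "cardiogram browser": ["select patient number", "select date/time",
                           "display record", "next record", "adjust parameters"],
}

def show(menu_name):
    print(f"[{menu_name}]")
    for i, item in enumerate(MENU[menu_name], start=1):
        print(f"  {i}. {item}")

show("main")
show("cardiogram browser")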

Airmed-cardio telemedicine system

The Airmed-Cardio telemedicine system [20] was developed as a patient, healthcare agent, and biomedical message transmission platform using GSM and the Internet for heart disease patients. Chronic heart disease patients in stable condition are classified into four categories: hypertension, arrhythmia, heart failure, and heart rehabilitation. A portable recording device and a mobile phone can be used to record cardiogram signals and transmit them over WAP when the patients are outside the range of a hospital mobile healthcare and surveillance system. This telemedicine system is a WWW-based automatic transmission platform that does not require any manual operations. The Airmed-Cardio telemedicine system comprises a patient sub-system and a health administration sub-system. These two sub-systems transmit data by using the GSM network (WAP protocol) and conduct two-way communication. The patient sub-system records cardiac and physiological parameters, thus enabling remote diagnosis of medical conditions. In patients with arrhythmia, six-channel ECG signals and blood pressure were measured using a portable six-channel electrocardiograph and a sphygmomanometer, respectively. In patients with heart failure, 12-channel ECG signals, blood pressure, and weight were measured using a 12-channel electrocardiograph and a sphygmomanometer. In patients with hypertension, blood pressure was measured and transmitted three times a day: in the morning, at noon, and in the evening. In patients undergoing rehabilitation after a heart problem, 12-channel ECG signals and pulse were measured once a day. The measurement devices used in this system include a Biolog3000i electrocardiograph, a Nokia 7110 mobile phone and GSM modem, an OMRON blood pressure gauge, and a MiniSPOT sphygmomanometer. The mobile patient sub-system transmits data to the GSM network base station through the GSM modem; the GSM network base station then transmits the data to the health administration sub-system through the Public Switched Telephone Network (PSTN) or the Internet. The health administration sub-system receives the N-channel ECG signals and comprises a cardiogram router, an SMS gateway, a database server, an e-mail client, a WWW server, a network router, and a WAP gateway; it transmits data outward through the GSM modem, the Internet, and the PSTN.
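
The per-category measurement protocol above is easy to capture as a small data structure, which is roughly how a monitoring scheduler might represent it. The sketch below encodes the schedule as described; the field names and the representation are our own, and frequencies not stated above are left unspecified.

# Measurement protocol of the Airmed-Cardio categories as described in [20] (illustrative).
# Field names and the scheduling representation are our own.

PROTOCOL = {
    "hypertension": {"signals": ["blood pressure"], "times_per_day": 3},
    "arrhythmia": {"signals": ["6-channel ECG", "blood pressure"],
                   "times_per_day": None},  # frequency not stated above
    "heart failure": {"signals": ["12-channel ECG", "blood pressure", "weight"],
                      "times_per_day": None},  # frequency not stated above
    "heart rehabilitation": {"signals": ["12-channel ECG", "pulse"], "times_per_day": 1},
}

for category, plan in PROTOCOL.items():
    freq = ("frequency unspecified" if plan["times_per_day"] is None
            else f"{plan['times_per_day']}x/day")
    print(f"{category}: {', '.join(plan['signals'])} ({freq})")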

3G mobile telemedicine system

Car accidents are a major cause of death in the USA. In order to reduce the number of deaths caused by car accidents and improve the chances of recovery, better medical care should be provided between the time at which paramedics arrive at an accident scene and the time at which the ambulance arrives at a hospital. Previously, voice communications over a mobile phone were used between the paramedics and the emergency ward physician to provide relevant pre-hospital medical care. With the increase in the data transmission rates and stability of 3G mobile phones, it is now possible to carry out voice and video interactions for pre-hospital medical care. Therefore, this work [21] develops a solution to realize the real-time delivery of medical care to a patient over a long distance via voice and video signals by making use of real-time medical signals. This technical solution [21] adopts the CDMA technique, which is capable of simultaneously transmitting voice and video signals, medical images, and real-time ECG signals. This mobile long-distance telemedicine system comprises patient and hospital sub-systems. The patient sub-system is onboard the ambulance, where patient information is recorded on a notebook computer that is connected to the medical sensors, a portable ultrasound machine, and an electrocardiograph. This information is then transmitted to the base station through the 3G wireless network, following which the 3G base station transmits the data to the hospital through the Internet. The patient sub-system possesses simple functions for analyzing medical images and ECG signals in case a local examination is required. The communication is carried out using a server-terminal model, in which the patient and hospital sub-systems both act as the server when they are transmitting information. A two-way wireless link is employed to enable communication between the patient sub-system and the 3G base station, whereas the Internet is used to enable communication between the 3G base station and the hospital. At the hospital sub-system, the audio, video, medical image, and ECG signals are first decompressed, and the information is then observed and stored. The patient sub-system includes G.729 audio, H.264 video, and JPEG2000 image encoders and decoders, as well as ECG signal acquisition and analysis modules, and it transmits these signals to the hospital sub-system via the 3G wireless transmission technique, where the information is then displayed on screen. With regard to the type of multimedia being transmitted, ECG signals and clinical images have the highest priority and are transmitted using the TCP protocol, whereas video signals have the lowest priority and are transmitted using the UDP protocol.
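
The priority scheme above pairs each media type with a transport protocol. A minimal sketch of that selection logic is shown below; only the TCP/UDP split for ECG, clinical images, and video follows the description in [21], while the numeric priorities and the audio entry are our own illustrative assumptions.

# Transport selection by media type, following the priority scheme described in [21].
# Numeric priorities are illustrative; the audio assignment is an assumption.

MEDIA_TRANSPORT = {
    "ecg":            {"transport": "TCP", "priority": 0},  # highest priority, reliable delivery
    "clinical_image": {"transport": "TCP", "priority": 0},
    "audio":          {"transport": "UDP", "priority": 1},  # transport for audio not stated in [21]
    "video":          {"transport": "UDP", "priority": 2},  # lowest priority, loss-tolerant
}

def transport_for(media):
    return MEDIA_TRANSPORT[media]["transport"]

assert transport_for("ecg") == "TCP"
assert transport_for("video") == "UDP"
print({m: v["transport"] for m, v in MEDIA_TRANSPORT.items()})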

UMTS mobile emergency telemedicine system

The UMTS mobile telemedicine system [22] can be used to transmit medical information from an ambulance moving at high speed to the hospital. The transmitted information includes audio, real-time video, and ECG signals. We discuss the performance of this third-generation mobile telemedicine solution in terms of the QoS for different transmission services. The objective of mobile telemedicine is to provide medical services at any place and any time without any restriction by making use of a highly mobile, stable platform with a high data transmission rate. With the progress in communication techniques, commercial wired telemedicine systems have already been realized; we are now attempting to realize wireless telediagnosis systems. For the design of a real-time wireless telediagnosis system, it is necessary to consider the transmission platform and the transmission speed. The data transmission rates of third-generation mobile communication techniques are 2 Mbps indoors and 384 kbps outdoors; therefore, these techniques are suitable for telemedicine applications. We discuss wireless telediagnosis system platforms on the basis of factors such as real-time operation, transmission reliability, interference, and system bandwidth. The wireless interface protocol structure includes physical layer parameters, transport and logical channels, higher-layer control commands, and paging control. The blood pressure and ECG signals, along with the audio and video signals of the patient, are measured onboard the ambulance. These signals serve as an input that is encapsulated into packets using the packet data convergence protocol, and the output is transmitted using the wireless communication control protocol. A reverse link is used to transmit the patient's information to the hospital. H.263 video compression encoding was adopted for the transmission of the video signals. The service quality parameters of these clinical biomedical signals are listed in Table 2. In addition, the physical layer transmission parameters of these biomedical signals are listed in Table 3. A multi-code CDMA mobile telemedicine system has been studied previously [23], and Niyato et al. [24] discussed an e-health application that uses the WiMAX communication technique.

Table 2 QoS parameters of the telemedicine system [22]
Table 3 Transmission parameters in the telemedicine PHY layer [22]
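
In a UMTS system, service quality is commonly expressed through the four standard 3GPP traffic classes. The sketch below shows one plausible way of attaching the telemedicine flows discussed above to those classes; this mapping is our own illustration and is not the QoS parameter set of Tables 2 and 3.

# One plausible mapping of telemedicine flows onto the four standard UMTS (3GPP)
# traffic classes. This is an illustration only, not the QoS parameters of [22].

UMTS_CLASSES = ("conversational", "streaming", "interactive", "background")

FLOW_TO_CLASS = {
    "two-way audio":        "conversational",  # tight delay and jitter bounds
    "real-time video":      "streaming",       # delay-tolerant within a playout buffer
    "ECG / blood pressure": "interactive",     # low-loss, request/response-like traffic
    "patient records":      "background",      # no real-time requirement
}

for flow, cls in FLOW_TO_CLASS.items():
    assert cls in UMTS_CLASSES
    print(f"{flow:22s} -> {cls}")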

Short-range wireless telemedicine

Short-range wireless communication techniques include Bluetooth, UWB, wireless local area networks, and Zigbee. The transmission distances of these communication schemes range from several centimeters to several hundred meters. In this section, we discuss a Bluetooth telemedicine system [27], the BodyLAN wearable sensor telemedicine system [28], and WLAN telemedicine systems [29, 30].

Bluetooth telemedicine system

This study [27] discusses the transmission of clinical biomedical signals from a mobile telemedicine processor to a mobile phone using the Bluetooth communication technique. Bluetooth has three classes, each of which has a different maximum transmission power and transmission range: 100 mW (20 dBm) and 100 m for Class 1, 2.5 mW (4 dBm) and 10 m for Class 2, and 1 mW (0 dBm) and 1 m for Class 3. The clinical biomedical signals consist of ECG signals of 100 KB, digital X-ray static images of 1 MB, and 30-s ultrasound image sequences of 10 MB. The ECG signal has a frequency range of 0.05–100 Hz and an amplitude of 10 µV to 5 mV, whereas the EEG signal has a frequency range of 0.5–60 Hz and an amplitude of 15–100 µV; the heart rate signal ranges from 45 to 200 beats per minute, and the respiration signal ranges from 12 to 40 respirations per minute.
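
The power figures quoted for the three Bluetooth classes are simply the milliwatt values expressed in dBm, using dBm = 10·log10(P / 1 mW). The short sketch below reproduces that conversion.

import math

def mw_to_dbm(power_mw):
    """Convert a power in milliwatts to dBm (dB relative to 1 mW)."""
    return 10.0 * math.log10(power_mw)

BLUETOOTH_CLASSES = {1: (100.0, "100 m"), 2: (2.5, "10 m"), 3: (1.0, "1 m")}

for cls, (power_mw, reach) in BLUETOOTH_CLASSES.items():
    print(f"Class {cls}: {power_mw} mW = {mw_to_dbm(power_mw):.0f} dBm, range ~{reach}")
# Class 1: 20 dBm, Class 2: ~4 dBm, Class 3: 0 dBm, matching the figures above.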

Wearable body network sensing telemedicine system

A wearable medical sensing system [28] called BodyLAN employs a body area network to transmit real-time physiological data obtained from a set of biomedical sensors distributed on the body to a PDA or a tablet computer. Such systems are compact, lightweight, and easy to wear. BodyLAN can be used with wireless local area network, Bluetooth, RF-ID, and Zigbee communication techniques. The specifications of BodyLAN are listed in Table 4.

Table 4 Specifications of the wearable sensor systems [28]

WLAN telemedicine system

An advanced telemedicine system that employs modern communication techniques, including the IEEE 802.11a/b/g/n wireless local area network standards, has been discussed [29, 30]. These studies also described the application platform and wireless telemedicine transmission services, such as nurse paging, voice service, telemedicine remote sensing, bedside patient surveillance, clinical physician alerting, drug barcode management, electronic medical record/clinical information system applications, patient billing, and so on, over a wireless local area network. A direct-sequence ultra-wideband (DS-UWB) wireless indoor telemedicine system has also been studied [31]. Ultra-wideband technology is a new technology for short-range high-speed wireless multimedia communication systems. The specifications corresponding to a data rate of 1,320 Mbps and a transmission range of 10 m indicate that DS-UWB is a suitable candidate for developing transmission platforms for wireless indoor telemedicine systems.

Discussion

The invention of the computer has driven the evolution of medical innovations, and the popularity of the Internet has fostered the spread of these innovations; the effects of such a spread have been observed in the case of E-hospitals, where the QoS of medicine has increased. In this study, we consider high-speed and high-reliability mobile communication techniques to improve medical services for emergency rescue systems, M-hospitals, personal healthcare, early warning systems for diseases, and illness rehabilitation. Table 5 discusses the mobile telemedicine transmission platforms, the transmitted biomedical media, and the system features. The operating modes include offline, FTP, HTTP, TCP/IP, and WAP modes. MSC, GSM, GPRS, 3G, RF-ID, Zigbee, Bluetooth, UWB, and WLAN technologies are applied to mobile telemedicine systems. The biomedical signals include BP, ECG, EEG, medical reports, HR, SpO2, temperature, medical databases, digital X-ray images, audio, and video signals; these signals are transmitted by using MSC, GSM, GPRS, 3G, RF-ID, Zigbee, Bluetooth, UWB, and WLAN transmission schemes. With the rapid development of mobile communication network technology, mobile telemedicine applications are becoming more diverse. Mobile telemedicine is not only a delivery platform for the transmission of biomedical signals, but also a tool that assists doctors in overcoming time and space constraints to improve medical services. McDermott et al. [11] discussed blood vessel radiography and performed family medicine clinical mobile telemedicine experiments. Carlos H. S. et al. [20] performed arterial hypertension, malignant arrhythmia, heart failure, and postinfarction mobile telemedicine experiments. Lee R. G. et al. [25] discussed transmission strategies over the GSM network, namely e-mail, SMS to the patient's mobile phone, SMS to the care provider's mobile phone, and ambulance dispatch with the approval of the care provider, for the normal, urgent, critical, and very critical urgency levels of mobile telemedicine applications, respectively. Further, mobile telemedicine employs advanced concepts and techniques from the fields of electrical engineering, computer science, biomedical engineering, and medicine. Examples of such tasks include data compression to reduce the volume of transmitted data; data encryption to protect the privacy of patients [36–38]; and obtaining instantaneous frequency data using approaches such as the Hilbert-Huang transform (HHT) in order to carry out time- or frequency-domain analysis of the electroencephalograms (EEGs) of alcoholic patients [39, 40]. Clinical media contain various types of information, such as text, graphics, audio signals, images, and video signals. Further, clinical signals should be acquired using various low-cost and high-accuracy clinical measurement instruments, and these signals should then be transmitted via wired or wireless multimedia communication networks. Telemedicine using wired multimedia networks has been commonly used and well studied in the past. This work [6] discussed the progress of wireless multimedia communication toward mobile telemedicine. Mobile telemedicine is an extension of wired telemedicine. One design method for a mobile telemedicine system is the Health Level 7 (HL7)-based design method [32–35]. This method can be used to design the operation of a mobile telemedicine system as well as to develop the information processing stage.
An HL7-based design model includes the message requirements, message contents, messaging behavior, and message specifications. Mobile telemedicine engineers should carefully consider the performance of wireless multimedia networks according to the HL7-based design model in order to better support the work of physicians and to increase the quality of medical services.
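
As noted in the discussion, approaches such as the Hilbert-Huang transform are used to obtain instantaneous frequency from EEG recordings [39, 40]. A minimal sketch of the core step, computing the analytic signal with a Hilbert transform and differentiating its phase, is shown below; it uses a synthetic chirp rather than real EEG data and omits the empirical mode decomposition stage of the full HHT.

import numpy as np
from scipy.signal import hilbert

# Instantaneous frequency via the analytic signal (core step of HHT-style analysis).
# A synthetic chirp is used in place of real EEG data; the EMD stage of the full HHT is omitted.

fs = 256.0                                    # sampling rate, Hz
t = np.arange(0, 4.0, 1.0 / fs)
x = np.sin(2 * np.pi * (8.0 + 2.0 * t) * t)   # chirp sweeping upward from ~8 Hz

analytic = hilbert(x)                          # analytic signal x + j*H{x}
phase = np.unwrap(np.angle(analytic))
inst_freq = np.diff(phase) * fs / (2 * np.pi)  # derivative of phase -> frequency in Hz

print(f"instantaneous frequency: {inst_freq[100]:.1f} Hz at t~0.4 s, "
      f"{inst_freq[-100]:.1f} Hz at t~3.6 s")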

Table 5 Discussion of mobile telemedicine transmission platforms, transmission biomedical media, and system features

Conclusion

Telemedicine is expected to become ubiquitous in the future, and it will therefore lead to major developments in the fields of medicine and engineering. In addition, telemedicine will lead to several new commercial opportunities and medical treatments, and it will require druggists and biomedical engineering equipment manufacturers to change their strategies. In this paper, we have described mobile satellite telemedicine systems, mobile cellular ground telemedicine systems, and short-distance wireless telemedicine systems. From these examples, we study the design concepts of mobile telemedicine and identify the parameters that need to be considered in the design of mobile telemedicine systems. In addition, we can gain an understanding of how mobile telemedicine systems have evolved over the last decade. In the future, advanced mobile communication techniques should enable an improvement in the diagnostic capability and QoS of mobile telemedicine systems.