Neurological disorders represent a particularly challenging use case for IoT in healthcare. To combat the effects of these disorders, such as epileptic seizures or Parkinsonian tremors, there is a need for multiple distributed sensor signals as well as specifically targeted and personalized drug delivery. The highly individual nature of indicators such as respiration, motion, and tremors, and of other detectable changes prior to a disease event (e.g., a seizure), implies that personalized solutions would have an immense impact on early prevention22. For these reasons, we chose a significantly simplified model of a movement disorder to demonstrate the DNS: movement was exemplified by motion/gestures, and therapy was exemplified by highly localized chemical delivery.
The sensory input comprised wearable sensors incorporated into textiles that were able to capture human motion. We chose wearable strain sensors, as this type of sensor has previously been shown to be a viable system for capturing such motion23. The wearable strain sensors, chosen based on mechanical durability and robustness, were composed of stretchable carbon electrodes separated by a dielectric (polyurethane, PU), all encased by acrylic tape and polydimethylsiloxane (PDMS) layers and a textile cover (Fig. 2a)24. The sensors exhibit a change in capacitance in response to geometrical changes, with low hysteresis. In brief, as the sensor is stretched along its long axis, the width and thickness are reduced in accordance with the Poisson effect, resulting in an overall increased capacitance. Assuming the elastomer behaves as an isotropic and incompressible material, the capacitance follows the ideal parallel-plate model with a linear relationship between capacitance and strain. This model predicts a capacitive gauge factor of 1, defined as (ΔC/C0)/ε, where ΔC denotes the change in capacitance, C0 the initial capacitance, and ε the strain. We investigated the samples' sensitivity and stretchability by cycling the sensors at a deformation rate of 5 mm/s with increasing strains. Deformation and relative capacitive changes of sensors of two different lengths, short and long (30 and 50 mm, both 10 mm wide), showed excellent linearity for strains ranging from 0 to 50% and a capacitive gauge factor of 1.0 (r2 = 0.98, Fig. 2b). The sensors maintain their sensitivity for strains up to 50% and exhibit high durability (1000 cycles between 20 and 40% and 30–50% strain repeated twice, Figs. S1, S2). The capacitive sensors exhibit low hysteresis, as quantitatively assessed by calculating the degree of hysteresis according to previously established methods (Fig. S3 and Table S1)25, i.e., comparing repeated loading and unloading of the strain sensor. Over 1000 cycles within the same window of applied strain, the recorded output between the 5th and 95th percentiles falls within a relative capacitance range of less than 3%.
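The unity gauge factor follows directly from the parallel-plate model under the incompressibility assumption; a brief sketch of the derivation is given below, using generic symbols (electrode overlap length L, width w, dielectric thickness t) that are not taken from the actual device dimensions.

```latex
\[
C_0 = \varepsilon_0 \varepsilon_r \frac{wL}{t}, \qquad
L' = L(1+\varepsilon), \quad
w' = \frac{w}{\sqrt{1+\varepsilon}}, \quad
t' = \frac{t}{\sqrt{1+\varepsilon}} \quad (\nu = 0.5)
\]
\[
C(\varepsilon) = \varepsilon_0 \varepsilon_r \frac{w'L'}{t'}
             = C_0(1+\varepsilon)
\;\Rightarrow\;
\frac{\Delta C}{C_0} = \varepsilon, \qquad
\mathrm{GF} \equiv \frac{\Delta C/C_0}{\varepsilon} = 1
\]
```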

Figure 2. Sensor and actuator/delivery nodes. (a) Capacitive strain sensors (short and long types) composed of a polyurethane dielectric sandwiched between stretchable carbon electrodes, encased in rubbers and textiles. (b) Relative change in capacitance (ΔC/C0) under repetitive cycling and increasing strains, and the extracted linear response with strain up to 50% (2 devices of each length used/shown). (c) Signals from the gesture capture glove with four individual finger sensors during a game of rock-paper-scissors and for incremental finger bending movements. (d) Illustration of an organic electronic ion pump (OEIP) delivery device composed chiefly of electrodes and an ion exchange membrane (IEM), encapsulated on a flexible or rigid substrate. (e) Visualization of ion delivery by transport of H+ from a source to a target electrolyte containing a pH indicator. The proton gradient increased and diffused radially away from the outlet during operation (3 V, on for 1 min). (f) Delivery of the neurotransmitter acetylcholine (ACh+, chemical structure inset) controlled by a portable driver unit (measuring charge as the integrated current) compared to the measured ACh+ amount.
The goal was to integrate wearable sensor nodes to detect physiologically relevant body motion and differentiate between symbolic healthy and unhealthy states, using simplified model systems in place of more advanced healthcare applications such as detecting epileptic seizures. These model systems enabled us to determine the healthcare BAN's overall abilities and to ascertain the advantages afforded by implementing deep learning on the sensor signals. The first model system to demonstrate monitoring of health parameters comprised a strain sensor fixed to a flexible belt worn across the chest or abdomen to monitor respiration, with readouts via an Arduino platform. Sensors placed in both positions recorded distinct magnitude and frequency variations between inhalation and exhalation, and between normal and deep breathing (Fig. S4, relative capacitive changes ranging between ~ 5–10% and ~ 25–30%, respectively). In contrast, abnormal breathing, such as shallow/chest breathing, was denoted by a lack of motion response (relative capacitive changes < 1%) in sensors placed on the abdomen, illustrating how multiple positions or separate sensing nodes can be used to monitor and distinguish various basic physiological patterns. An additional model system for monitoring distinguishable body movements and patterns is gesticulation, i.e., hand or finger movement (Fig. 2c). We thus manufactured a gesture capture glove with four wearable sensors attached to the index, middle, ring, and pinky fingers. Again, the sensor response was recorded via an Arduino. In this application, the sensors were strained upon finger bending, resulting in relative capacitive changes from 0 to ~ 25% when moving from an open position to a closed palm position. From these signals, various gestures can easily be distinguished in real time, as exemplified by the familiar rock-paper-scissors hand game (Fig. 2c). In future smart wearables, the movement repertoire can of course be expanded and fine-tuned by incorporating more sensor nodes. For example, as the sensors are capable of accurately detecting even small displacements, such dynamic ∆C/C0 patterns could also be used to identify subtle or otherwise abnormal motion patterns.
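To illustrate how such ΔC/C0 traces can be mapped to gestures in real time, a minimal sketch follows. The 7% bending threshold matches the initial rule described later in the text; the channel ordering and example readings are hypothetical rather than measured values.

```python
# Minimal sketch: map four finger strain-sensor readings (ΔC/C0, in %) to
# rock-paper-scissors gestures with a simple bending threshold.
# The threshold and example readings are illustrative, not measured values.

BEND_THRESHOLD = 7.0  # % relative capacitance change taken to mean "finger bent"

def classify_gesture(dc_over_c0):
    """dc_over_c0: (index, middle, ring, pinky) relative capacitance changes in %."""
    bent = [x > BEND_THRESHOLD for x in dc_over_c0]
    if all(bent):
        return "rock"        # closed fist: all four fingers bent
    if not any(bent):
        return "paper"       # open palm: no finger bent
    if not bent[0] and not bent[1] and bent[2] and bent[3]:
        return "scissors"    # index and middle extended, ring and pinky bent
    return "unknown"

# Example readings (hypothetical):
print(classify_gesture((1.2, 0.8, 24.5, 23.9)))    # -> "scissors"
print(classify_gesture((25.0, 26.1, 24.3, 22.8)))  # -> "rock"
```

As the text notes, such fixed thresholds are brittle across users and glove fits, which is what motivates the learned classifier described below.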
The organic electronic ion pump (OEIP)26,27 is an iontronic/electrophoretic delivery device exhibiting high electronic and pharmaceutical dose precision and established autoregulation capabilities when coupled to sensors28,29. For instance, electronic control over the delivery of the neurotransmitter γ-aminobutyric acid (GABA) resulted in highly localized suppression of epileptiform activity in rodent hippocampal slice models29. OEIPs are an ideal drug delivery node for the DNS because they are electronically addressable, can accurately translate electronic signals into precise delivery of (charged) biochemical substances (Fig. 2d–f), and have been demonstrated to regulate and provide therapy for neurological disorders including pain30 and epilepsy29. In OEIPs, the number of ions delivered is directly proportional to the time-integrated current (i.e., total charge) passed through the circuit. Based on this principle, we manufactured a custom portable/wearable OEIP driver with current-integrator circuitry. When activated, the unit supplied a constant voltage (0–5 V) to the OEIPs for a set amount of time or until a set quantity of charge had been delivered. This multipurpose and adaptable driver allowed operation of a wide range of OEIPs, which is important as pump designs and their requirements (voltage, delivery dynamics, etc.) vary depending on the application. The operation and control of OEIPs were tested using the custom driver as well as traditional source-meter units. OEIPs were either microfabricated by photolithographic patterning techniques31 or screen-printed32, as previously established.
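The charge-limited operating principle can be sketched as a simple control loop: apply a constant voltage, integrate the measured current, and switch off once the set charge has passed. The sketch below is illustrative only; set_voltage() and read_current() are placeholder stand-ins for the actual driver circuitry, not its firmware.

```python
# Sketch of charge-limited OEIP operation (assumed interface, not the actual firmware).
import time

def set_voltage(v):      # placeholder for the driver's output stage
    pass

def read_current():      # placeholder for the driver's current monitor, in µA
    return 5.0           # assume a constant 5 µA for this sketch

def deliver_charge(charge_limit_uC, drive_voltage=3.0, dt=0.01):
    """Drive the OEIP until the time-integrated current reaches charge_limit_uC."""
    delivered_uC = 0.0
    set_voltage(drive_voltage)          # pump on
    while delivered_uC < charge_limit_uC:
        i_uA = read_current()           # instantaneous current (µA)
        delivered_uC += i_uA * dt       # rectangular integration: ΔQ = I·Δt
        time.sleep(dt)
    set_voltage(0.0)                    # pump off once the dose has been delivered
    return delivered_uC
```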
OEIPs with a relatively short and wide channel (1–2 mm and 200–500 µm, respectively) exhibited the most clearly visible actuation response. The on-state of a pump (applied voltage of 3 V) was displayed through the transport of protons (H+) from the source reservoir, containing 10 mM HCl(aq), to the target electrolyte containing a pH indicator (Fig. 2e; Supplementary Movie 1). This resulted in H+ gradients and a nearly instant color change from yellow to red at the outlet, a pattern which diffused radially outward over time. Next, the controller circuit's capabilities were tested by driving the OEIP at 4 V to transport the neurotransmitter acetylcholine (ACh+) from the source to the target. The charge limit, i.e., the time-integrated current, was set over the range 100–200 µC and compared to the actual amount of delivered ions in the target solution, quantified using a fluorometric assay. Figure 2f shows an excellent linear correlation between the set and measured amounts of ACh+, revealing that the driver enabled direct control over delivery via predetermined charge limits.
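For a monovalent cation such as ACh+, Faraday's law gives an upper bound on the delivered amount for a given charge limit (assuming ideal, fully efficient transport, which real devices only approach); for the 200 µC limit used here:

```latex
\[
n_{\mathrm{ACh^+}} \le \frac{Q}{zF}
  = \frac{200\ \mu\mathrm{C}}{1 \times 96{,}485\ \mathrm{C\,mol^{-1}}}
  \approx 2.1\ \mathrm{nmol}
\]
```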
A body-coupled communication (BCC) system (BodyCom system, Microchip Technology Inc.) was explored as a secure and energy-efficient BAN communication pathway. Our modified version of the system allowed transmission of data between separate sensor nodes located at several positions on the body, provided they were in close proximity (touching or within a few centimetres of) the same person's skin33. This custom setup thus offers a user-friendly, secure connection between the sensing elements, which inform healthcare decisions, and the actuating devices, which carry out appropriate therapies (e.g., delivery of relevant neurotransmitters). The electrical signal propagates along the surface of the skin by means of a capacitive field, which serves as the communication pathway for transferring short messages, sensor/actuator data, keys, or identity information between BCC mobile tags and a base unit (Fig. S5). For this purpose, sensor responses were collected, locally processed (via Arduino), and translated into short messages, e.g., representing different states such as "one finger bent" or "open hand", or even raw sensor data. The overall BCC system used in our experiments consists of a base unit, provided by Microchip but with modified software, and a BCC mobile unit, usually referred to as the "BCC tag", which we designed and manufactured. In a typical setup, one base unit manages communication between several mobile units using the capacitive coupling technique. The base unit continuously requested and received sensor data and transmitted this information to the actuator BCC tag, where decision-making triggered the corresponding action.
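A hedged sketch of this request-and-forward behaviour is given below. The tag identifiers, state strings, and transport callables are placeholders standing in for the modified BodyCom firmware, which is not reproduced here.

```python
# Sketch of the base unit's polling cycle: request the sensor tag's state,
# push any state change to the actuator tag, and mirror it over Bluetooth.
# Tag names, state strings, and the transport functions are hypothetical.

SENSOR_TAG, ACTUATOR_TAG = "glove", "oeip"

def poll_once(bcc_request, bcc_send, bt_send, last_state):
    """One polling cycle; bcc_request/bcc_send/bt_send wrap the actual radios."""
    state = bcc_request(SENSOR_TAG)       # e.g. "one_finger", "open_hand", ...
    if state != last_state:
        bcc_send(ACTUATOR_TAG, state)     # trigger decision-making at the actuator node
        bt_send(state)                    # forward to the mobile device / cloud
    return state
```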
Additionally, data collected by the base unit was communicated to external mobile devices via Bluetooth. The sensor and actuator data could thus be forwarded to a dedicated cloud data handling service for storage, analysis, training of machine learning models, and eventually prediction via these models for improved detection limits and therapies (Fig. 3a). Here, a Hopsworks34 cloud platform was utilized and further developed to streamline the flow and handling of large e-health data sets, as more nodes can be coupled to the BAN system to increase the number of available health monitoring parameters. Data gathered from individual users, or from all users, can be fed into the platform to optimize the decision-making model and the prediction accuracy of the neural network, and improved models can thereafter be downloaded from the cloud to the BAN nodes. An instructive experiment was carried out in which 16 volunteers wore the motion-capture glove while repetitively cycling through a predetermined list of five possible gestures. Notably, the individual users' bending response ranges and patterns varied, partially due to the fit of the glove (Fig. S6). The collected data was then used to train a two-layer classification network implemented in TensorFlow. After running the hyperparameter exploration, an accuracy of 97% was reached (Fig. 3b). In contrast, our first solution for predicting the hand position, i.e., setting a stretch threshold of 7% to correspond to a bent finger, had an accuracy of 53%, and finding an optimal threshold (3%) in hindsight provided an accuracy of only 82%. To illustrate the advantage of continuously gathering data and training, we measured the evolution of the accuracy with respect to the quantity of data used for training (Fig. 3b). This scenario was rather simple, leading to a rapid increase in accuracy when modelled, despite variations in how well the glove fit each participant and associated variations in the strain recordings depending on hand size. However, considerably more complex and medically relevant questions, e.g., epileptic seizure prediction, will require larger data sets and more widely distributed sensors.
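A two-layer classifier of the kind described can be sketched in TensorFlow/Keras as follows. The layer width, training settings, and placeholder data are assumptions for illustration; the actual hyperparameters were determined by the exploration mentioned above.

```python
# Sketch of a small two-layer gesture classifier: four ΔC/C0 inputs, five gesture classes.
import numpy as np
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu", input_shape=(4,)),  # hidden layer (width assumed)
    tf.keras.layers.Dense(5, activation="softmax"),                  # five gesture classes
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Placeholder data standing in for the participants' glove recordings:
X = np.random.rand(1000, 4).astype("float32")   # ΔC/C0 for the four fingers
y = np.random.randint(0, 5, size=1000)          # gesture labels 0..4
model.fit(X, y, epochs=20, batch_size=32, verbose=0)
```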

Figure 3. Digital nervous system integration and implementation of machine learning. (a) The DNS proof-of-concept, where the sensing node (glove) state was received by the base unit/tag (in contact with the person, but out of view during filming) and forwarded to the actuator node (ion pump) through the capacitive field of the body. The received state was further sent from the base unit to the mobile device via Bluetooth for interaction with the Hopsworks cloud-based machine learning system (steps 1–4 and a–e). (b) Accuracy of a neural network classification of the hand position as a function of the quantity of data used to train it, compared to the accuracy of an initially estimated threshold and an optimized threshold corresponding to bent fingers. (c) Accuracy of a neural network classification of the hand position as a function of the simulated drift in sensor measurements relative to the training data. The drift was applied to one finger (grey) or all fingers (orange) at the same magnitude. Photographs in part (a) taken by Thor Balkhed at Linköping University and used with permission.
In addition to improved accuracy compared to empirically chosen decision-making thresholds for sensor data, deep learning provides robustness. To demonstrate this, two separate experiments were carried out: a sensor drift simulation (e.g., due to sensor degradation/aging) and an ablation test (e.g., complete loss of one of the sensor signals). The simulation showed that the deep learning system can handle up to 3% drift while losing less than one percentage point of accuracy (Fig. 3c). The loss in accuracy thereafter increases linearly with higher drift. In comparison, applying a drift larger than 3% to a threshold-based system (with a threshold of 3%) rapidly results in an accuracy of 0. Moreover, continuously sending data to Hopsworks would allow the system to detect the drift and to retrain the deep learning system to adapt accordingly. The accuracy after training with the drift taken into account was 97%. With a threshold-based approach, the ablation of one sensor makes it impossible to differentiate between two of the hand positions; the accuracy is at best 80%, and software correction is not possible. With the deep learning system, the accuracy also falls below 80%. However, retraining the system allows rapid detection of the absence of one of the sensors, and after retraining an accuracy of 89% is obtained. This is because bending one finger also modifies the position of the other fingers, and the deep learning system can detect and use even slight finger position modifications to improve the accuracy. This highlights the importance of having multiple sensors measuring different metrics, for both increased accuracy and redundancy. It also becomes apparent that the deep learning system can detect and learn from complex and subtle patterns, as well as handle sensor failure, thanks to its capacity to identify hidden patterns.
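The drift experiment can be reproduced schematically by offsetting held-out inputs before evaluation, as in the sketch below. It reuses the model and placeholder data from the previous sketch, and the drift magnitudes and channel indices are illustrative only.

```python
# Sketch of the drift/ablation robustness test: add a constant offset (simulated drift)
# to one or all sensor channels of test data and re-evaluate the classifier.
import numpy as np

def evaluate_with_drift(model, X_test, y_test, drift, channels):
    """Apply `drift` (same ΔC/C0 units as the data) to the given channels and score."""
    X_drifted = X_test.copy()
    X_drifted[:, channels] += drift
    _, acc = model.evaluate(X_drifted, y_test, verbose=0)
    return acc

# Example: drift applied to one finger vs. all fingers (model, X, y from the sketch above)
acc_one = evaluate_with_drift(model, X, y, drift=0.03, channels=[0])
acc_all = evaluate_with_drift(model, X, y, drift=0.03, channels=[0, 1, 2, 3])
```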
The DNS proof-of-concept was achieved by bringing together all the components in a fully functional wireless closed-loop system, as visualized in Fig. 3a (Supplementary Movie 1). The sensor node comprised the gesture glove, Arduino controller, and BCC tag. The actuator node comprised the OEIP, Arduino controller, BCC tag, and additional indicator LEDs. Gestures corresponded to different states that each triggered a specific action within the BAN, as visualized in Fig. 4. Specifically, counting on one, two, or three fingers (Fig. 4i–iii) corresponded to setting three drug delivery levels (charge limits of 100, 150, or 200 µC, respectively); the "rocker" gesture (Fig. 4iv) signalled activation of the pump (on-state until the specified amount had been delivered); the open palm paused the action (Fig. 4v); and the closed fist (Fig. 4vi) cleared all commands and brought the system back to its default off-state. The base unit continuously requested the status from all mobile tags coupled to the body (Supplementary Movie 2). Upon a change of state in the glove multi-sensor node, the base unit pushed the state to the actuator BCC tag (in real time), while simultaneously sending the information to a mobile device via Bluetooth to visually confirm the state alteration and to relay information to the cloud for machine learning and progressively improved decision-making. During the visual confirmation, a delay between the hand gesture and the display on the mobile device was noted (Supplementary Movie 1), which largely stemmed from a non-optimized Bluetooth connection between the BCC base unit and the mobile device. As can also be seen (Supplementary Movie 1), there was no delay between the hand gesture and activation at the actuator tag (OEIP). Once the actuator tag received a specific state, it transmitted the corresponding decision for action to the controller unit to engage the OEIP according to the previously described protocols. OEIP operation was visually aided by an array of three LEDs, where the setting of the three drug delivery levels was represented by turning on one, two, or three LEDs (indicating 100, 150, or 200 µC, Fig. 4i–iii); the on-state was displayed by blinking (Fig. 4iv); and all LEDs were turned off as the system was cleared (Fig. 4vi). As soon as a given pumping sequence was completed, the status of the actuator node changed; this was detected upon the next status request by the base unit and thereafter relayed to the mobile device via Bluetooth. Additionally, all data received by the mobile device could be uploaded to the cloud service. The stored sensor and actuator data can thereby be accessed by the user or their physician and further evaluated to improve detection and treatment schemes.
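The gesture-to-action mapping at the actuator node amounts to a small state machine; the sketch below mirrors the behaviour described above, with the state names chosen for illustration rather than taken from the actual node software.

```python
# Sketch of the actuator node's decision logic: map received gesture states to
# delivery actions (charge limits in µC), mirroring the behaviour shown in Fig. 4.

CHARGE_LIMITS = {"one_finger": 100, "two_fingers": 150, "three_fingers": 200}

class ActuatorNode:
    def __init__(self):
        self.charge_limit_uC = None   # no delivery level set
        self.pumping = False          # default off-state

    def on_state(self, gesture):
        if gesture in CHARGE_LIMITS:                      # i-iii: set the delivery level
            self.charge_limit_uC = CHARGE_LIMITS[gesture]
        elif gesture == "rocker":                         # iv: activate the pump
            self.pumping = self.charge_limit_uC is not None
        elif gesture == "open_palm":                      # v: pause the action
            self.pumping = False
        elif gesture == "closed_fist":                    # vi: clear and return to off-state
            self.charge_limit_uC, self.pumping = None, False
```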

Figure 4. Demonstration of the digital nervous system (DNS) proof-of-concept. Time-lapse and corresponding image panels (system overview, sensor and actuator nodes; Supplementary Movie 1) displaying: setting the drug dispensing levels by counting to three (i–iii), pump activation by the 'rocker' gesture (iv), pausing (v), and clearing the system (vi), together with the associated LED array responses next to an inserted ion pump. Photographs taken by Thor Balkhed at Linköping University and used with permission.